Is my landing page "over-optimized"? Please help
-
Hello out there
My websites, www.painterdublin.com and www.tilers-dublin.com, were heavily hit by the Google Panda update on 27 September 2012 and the EMD update a few days after. I lost about 70% of my traffic, mainly from combinations of the keywords in my domain names (painter dublin and tilers dublin), and never managed to recover from it.
I am wondering if I should also concentrate on rewriting the content of both home landing pages in terms of "KEYWORD DENSITY". Do you think my content is "OVER OPTIMIZED" for my main keywords (painter dublin, tilers dublin)? What is the correct usage? Is there any tool to guide me?
I am aware I am using those terms quite often. I don't want to start deleting them before I know the right way to do it. Is there anybody willing to have a look at my sites and give me advice, please?
Kind regards,
Jaro
-
I think you should focus less on your page content and more on how your website is coded and your inbound link profile.
The title tags and description tags on both of your sites contain special characters that shouldn't be there, like bullets and check marks. Clean those up first.
On your tiling website you've included your phone number in the title tag; that's a definite no-no, and I've seen sites penalized for that in the past.
Just taking a brief peek at your inbound links, I'm guessing links from sites like this one are more likely to be the reason for your problems than over-optimization: blackhatpwnage.com/how-to-get-high-page-rank/
Your backlink profile overall is very keyword-specific. That, combined with all of the above, is the more likely reason for your problems than "over-optimization" of your keywords.
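As a rough illustration of the title-tag cleanup described above (a hypothetical helper, not an official Moz or Google tool), a few lines of Python can flag the two issues called out here: non-ASCII symbols such as bullets and check marks, and anything that looks like a phone number. The sample title string is made up:

```python
import re

def flag_title_issues(title):
    """Return a list of problems found in a <title> string.

    Flags non-ASCII symbols (bullets, check marks, etc.) and anything
    that looks like a phone number: the two issues called out above.
    """
    issues = []
    # Collect any characters outside the ASCII range (bullets, ticks, etc.)
    specials = sorted({ch for ch in title if ord(ch) > 127})
    if specials:
        issues.append("non-ASCII characters: " + " ".join(specials))
    # Crude phone-number heuristic: a long run of digits/spaces/dashes/parens
    if re.search(r"\d[\d\s\-()]{6,}\d", title):
        issues.append("possible phone number")
    return issues

print(flag_title_issues("✓ Painter Dublin • Call 01 234 5678"))
```

A clean, text-only title returns an empty list; the sketch is only a starting point for auditing titles in bulk.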
Best of luck!
Related Questions
-
I need help on how best to do a complicated site migration. Replacing certain pages with all-new content and tools, and keeping the same URLs. The rest just need to disappear safely. Somehow.
I'm completely rebranding a website but keeping the same domain. All content will be replaced and it will use a different theme and mostly new plugins. I've been building the new site as a different site in Dev mode on WPEngine. This means it currently has a made-up domain that needs to replace the current site. I know I need to somehow redirect the content from the old version of the site. But I'm never going to use that content again. (I could transfer it to be a Dev site for the current domain and automatically replace it with the click of a button - just as another option.) What's the best way to replace blahblah.com with a completely new blahblah.com if I'm not using any of the old content? There are only about 4 URLs, such as blahblah.com/contact, that will remain the same - with all content replaced. There are about 100 URLs that will no longer be in use or have any part of them ever used again. Can this be done safely?
Intermediate & Advanced SEO | brickbatmove1 -
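One common way to handle the ~100 retired URLs in a migration like the one described above (a sketch only, assuming an Apache-style server and made-up paths) is to give them an explicit "410 Gone" response rather than letting them fall into soft 404s; kept URLs need no rules at all, since their paths don't change:

```python
# Hypothetical inventory for a rebrand that keeps the domain: retired
# paths get an explicit 410 so crawlers learn the removal is intentional.
RETIRED = ["/old-landing", "/old-blog-post", "/old-service"]  # ~100 in practice

def gone_rules(paths):
    """Emit Apache mod_alias lines returning 410 Gone for retired paths."""
    return [f"Redirect gone {p}" for p in sorted(set(paths))]

for line in gone_rules(RETIRED):
    print(line)
```

The generated lines would go in the site's Apache config or .htaccess; "Redirect gone" is mod_alias's shorthand for a 410 response.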
Major landing page removed from Google SERP and replaced by homepage URL. How do I fix it?
Hi. A major landing page has been removed from the Google SERP and replaced by the homepage URL. How do I fix it? Why does this happen on an SPA website (AngularJS)?
Intermediate & Advanced SEO | cafegardesh0 -
Links / Top Pages by Page Authority ==> pages shouldn't be there
I checked my site's links and top pages by Page Authority. What I found I don't understand, because the first 5-10 pages don't exist! You should know that we launched a new site and rebuilt the static pages, so there are a lot of new pages, and of course we deleted some old ones. I refreshed the sitemap.xml (these pages are not in there) and uploaded it in GWT. Why do those old pages appear under the Links menu at Top Pages by Page Authority? How can I get rid of them? Thanks, Endre
Intermediate & Advanced SEO | Neckermann0 -
301 redirects broken - problems - please help!
Hi, I have a bit of an issue...

Around a year ago we launched a new company. This company was launched out of a trading style of another company owned by our parent group (the trading style no longer exists). We used a lot of the content from the old trading-style website, carefully mapping page-to-page 301 redirects, using the change of address tool in Webmaster Tools, and generally did a good job of it. The reason I know we did a good job is that although we lost some traffic in the month we rebranded, we didn't lose rankings. We have since gained traffic exponentially and have managed to increase our organic traffic by over 200% over the last year. All well and good.

However, a mistake has recently occurred whereby the old trading-style domain was deleted from the server for a period of around 2-3 weeks. It has since been reinstated. Since then, although we haven't lost rankings for the keywords we track, I can see in Webmaster Tools that a number of our pages have been deindexed (around 100+).

It has been suggested that we put the old homepage back up and include a link to the XML sitemap to get Google to recrawl the old URLs and reinstate our 301 redirects. I'm OK with this (up to a point; personally I don't think it's an elegant solution), however I always thought you didn't need a link to the XML sitemap from the website and that the crawlers should just find it?

Our current plan is not to put the homepage up exactly as it was (I don't believe this would make good business sense given that the company no longer exists), but to make it live with an explanation that the website has moved to a different domain, with a big old button pointing to the new site. I'm wondering if we also need a button to the XML sitemap or not? I know I can put a sitemap link in the robots file, but I wonder if that would be enough for Google to find it?

Any insights would be greatly appreciated. Thank you, Amelia
Intermediate & Advanced SEO | CommT0 -
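For reference on the robots.txt idea raised in the question above: a Sitemap directive in robots.txt is a standard, crawler-discoverable way to expose a sitemap, with no on-page link required. A minimal file (domain hypothetical) looks like:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap line uses an absolute URL and can appear anywhere in the file; crawlers that fetch robots.txt pick it up automatically.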
Pages are being dropped from index after a few days - AngularJS site serving "_escaped_fragment_"
My URL is: https://plentific.com/

Hi guys,

About us: We are running an AngularJS SPA for property search. Being an SPA, an entirely JavaScript application, it has proven to be an SEO nightmare, as you can imagine. We are currently implementing the AJAX crawling approach and serving an "_escaped_fragment_" version using PhantomJS. Unfortunately, pre-rendering of the pages takes some time and, even worse, on separate occasions the pre-rendering fails and the page appears to be empty.

The problem: When I manually submit pages to Google using the Fetch as Google tool, they get indexed and actually rank quite well for a few days, and after that they just get dropped from the index. Not getting lower in the rankings, but totally dropped. Even the Google cache returns a 404.

The question:
1.) Could this be because of serving an "_escaped_fragment_" version to the bots (bear in mind it is identical to the user-visible one)?
2.) Could our use of an API to get our results lead to the pages being considered "duplicate content"? And shouldn't that just result in a lower SERP position instead of a drop?
3.) Could this be a technical problem with how we serve the content, or does Google just not trust sites served this way?

Thank you very much!
Pavel Velinov
SEO at Plentific.com
Intermediate & Advanced SEO | emre.kazan1 -
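For context on the setup described in the question above: under Google's (since-deprecated) AJAX crawling scheme, the crawler rewrites a #! URL into a query-string URL that the server is expected to answer with the pre-rendered snapshot. A sketch of that mapping, with a hypothetical search path:

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Map a hashbang URL to the _escaped_fragment_ form a crawler requests.

    Example mapping under Google's AJAX crawling scheme:
    https://plentific.com/#!/search  ->
    https://plentific.com/?_escaped_fragment_=/search
    """
    if "#!" not in url:
        return url  # not a hashbang URL; requested as-is
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    # The scheme percent-encodes special characters in the fragment value.
    encoded = quote(fragment, safe="=&/")
    return f"{base}{sep}_escaped_fragment_={encoded}"
```

If the snapshot served at the rewritten URL is empty (as happens when the pre-render fails), that empty page is exactly what gets indexed, which is consistent with pages being dropped.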
How to build links to landing pages?
I have been using link bait like infographics to get quality links to my site, and I have observed that these tactics are great for getting links to the home page or the particular post page where the infographic was originally published. But we have various other important landing pages, and we want to transfer some link equity to those pages. Whenever we publish an infographic, we post it on our blog with an embed code carrying anchor text pointed at our site's home page. People who share our infographic normally link to the home page or to the post page where they found that particular item. So, what are the possible ways to get links to any other landing page? Can we post some bait on other landing pages as well? I need to know some more techniques to attract deep links. Thanks
Intermediate & Advanced SEO | shaz_lhr1 -
"Duplicate" Page Titles and Content
Hi All, This is a rather lengthy one, so please bear with me!

SEOmoz has recently crawled 10,000 webpages from my site, FrenchEntree, and has returned 8,000 errors of duplicate page content. The main reason I have so many is the directories I have on site. The site is broken down into two levels of hierarchy: "weblets" and "articles". A weblet is a landing page, and articles are created within these weblets. Weblets can hold any number of articles, 0 to 1,000,000 (in theory), and an article must be assigned to a weblet in order for it to work. Here's how it roughly looks in URL form: http://www.mysite.com/[weblet]/[articleID]/

Now, our directory results pages are weblets with standard content in the left- and right-hand columns, but the information in the middle column is pulled in from our directory database following a user query. This happens by adding the query string to the end of the URL. We have 3 main directory databases, but perhaps around 100 weblets promoting various 'canned' queries that users may want to navigate straight into. However, any one of the 100 directory-promoting weblets could return any query from the parent directory database with the correct query string.

The problem with this method (as pointed out by the 8,000 errors) is that each possible permutation of a search is considered to be its own URL, and therefore its own page. The example I will use is the first alphabetically, "Activity Holidays in France":

http://www.frenchentree.com/activity-holidays-france/ - This link shows you a results weblet without the query at the end, and therefore only displays the left- and right-hand columns as populated.

http://www.frenchentree.com/activity-holidays-france/home.asp?CategoryFilter= - This link shows you the same weblet with an 'open' query on the end, i.e. display all results from this database. Listings are displayed in the middle.

There are around 500 different URL permutations for this weblet alone when you take into account the various categories and cities a user may want to search in.

What I'd like to do is to prevent SEOmoz (and therefore search engines) from counting each individual query permutation as a unique page, without harming the visibility that the directory results receive in SERPs. We often appear in the top 5 for quite competitive keywords and we'd like it to stay that way. I also wouldn't want the search engine results to only display (and therefore direct the user through to) an empty weblet by some sort of robot exclusion or canonical classification.

Does anyone have any advice on how best to remove the "duplication" problem whilst keeping the search visibility? All advice welcome. Thanks, Matt
Intermediate & Advanced SEO | Horizon0 -
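One common remedy for this kind of query-string duplication (not necessarily the right call for these specific pages, since it consolidates the permutations rather than ranking each one) is a rel=canonical tag pointing every filtered permutation at its base weblet URL. A sketch of deriving that canonical, using the example URL from the question:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_for(url):
    """Strip the query string so every filter permutation of a results
    page canonicalizes to its base weblet URL."""
    scheme, netloc, path, _query, _frag = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))

url = "http://www.frenchentree.com/activity-holidays-france/home.asp?CategoryFilter="
print(f'<link rel="canonical" href="{canonical_for(url)}" />')
```

The emitted link tag would go in the `<head>` of each results page; search engines then fold the ~500 permutations into the one canonical URL instead of treating each as a duplicate.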
We are changing ?page= dynamic URLs to /page/ static URLs. Will this hurt the progress we have made with the pages using dynamic addresses?
Question about changing URLs from dynamic to static to improve SEO, but concerned about hurting the progress made so far.
Intermediate & Advanced SEO | h3counsel0
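The usual way to preserve the equity those dynamic URLs have earned (a sketch only, with hypothetical paths) is a one-to-one 301 map from each ?page= URL to its new static form:

```python
from urllib.parse import urlsplit, parse_qs

def static_equivalent(url):
    """Map a dynamic ?page=foo URL to its new /foo/ static form,
    the target a 301 redirect should point at."""
    parts = urlsplit(url)
    page = parse_qs(parts.query).get("page", [None])[0]
    if page is None:
        return url  # nothing to rewrite
    return f"{parts.scheme}://{parts.netloc}/{page}/"

old = "http://www.example.com/index.php?page=services"
print(f"301: {old} -> {static_equivalent(old)}")
```

With each old URL 301-redirecting to its exact static counterpart (rather than everything to the homepage), rankings generally carry over through the change.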