Fresh content has had a negative effect on my SERPs
-
Hi there,
I was ranking pretty well for highly competitive keywords without actually doing any link building (please see the graph attached), so I thought I had an opportunity here to get to page 1 for these keywords. The plan was to write fresh and original content for these pages, because hey, Google loves fresh content, right?
Well, it seems not. One week after these pages were re-written (21st Feb 2012), all of them dropped altogether. Please note: all the pages were under the same directory:
/health/flu/keyword-1
/health/flu/keyword-2 and so on...
I have compared both versions, as I have backups of the old content:
- On average, there are more words on each of the new pages compared to the previous pages
- Lower bounce rate, by at least 30% (via AdWords)
- More time on site, by at least 2 minutes (via AdWords)
- More page visits (via AdWords)
- Lower keyword density: on average 4% (new pages) compared to 9% (old content) across all pages
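As a side note on how a density figure like those might be computed, here is a rough sketch in Python. This is a simplified word-level count (real tools may handle phrases, stemming and stop words differently), and the sample text is a made-up snippet, not the poster's actual content:

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that match `keyword` exactly.

    A simplified sketch: single-word exact match only, no stemming
    or phrase handling.
    """
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Hypothetical snippet from a flu page, not the poster's real content
sample = "flu symptoms and flu treatment: how to treat flu at home"
print(round(keyword_density(sample, "flu"), 1))  # → 27.3
```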
So, since the end of February, these pages are still not ranked for these keywords. The funny thing is, these keywords are on page 1 of Bing.
Another note: we launched an Irish version of the website using the exact same content. I have done all the checks via Webmaster Tools, making sure it's pointing to Ireland, and I have also got hreflang tags on both websites (just in case).
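For reference, the cross-site hreflang setup described would look something like this in the head of each version (example URLs, not the actual site):

```html
<!-- On the .co.uk page (example URLs, not the poster's real pages) -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-ie" href="http://www.example.ie/health/flu/keyword-1" />
<!-- The same pair of tags also goes on the matching .ie page -->
```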
If anyone can help with this that would be very much appreciated.
Thanks
-
Hi Gary,
Not sure I can add anything not said here, but if you feel inclined to send me a PM with the URLs, I'd be more than happy to take a look at it.
-
Hi Cyrus,
I forgot to mention that I put canonical and hreflang tags on both pages in question (.co.uk and .ie), and it now seems Google has finally crawled the .ie page. Unfortunately, on Google.ie the keyword is no longer ranked. The good news is that when I paste the first paragraph of the .co.uk page into Google.co.uk, it's the .co.uk page that appears, not the .ie page.
Is there no way for the .ie version to rank at all in Google.ie? It seems a shame that Google cannot get this right.
Thanks
-
Hi Gary,
Good question. Could be a couple of things going on. Let me address each in turn.
1. Duplicate content and the Irish version of your site. This could be an issue if you're duplicating content, even with the hreflang tags. Google also recommends using canonical tags on international versions, in addition to hreflang tags, if the content is duplicated.
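As a sketch of that canonical-plus-hreflang combination (placeholder URLs, and assuming the .co.uk page is treated as the original), the duplicated .ie page would carry both annotations:

```html
<!-- In the <head> of the duplicated .ie page (example URLs) -->
<link rel="canonical" href="http://www.example.co.uk/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-ie" href="http://www.example.ie/health/flu/keyword-1" />
```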
Good discussion here:
http://googlewebmastercentral.blogspot.fr/2011/12/new-markup-for-multilingual-content.html
and here:
https://plus.google.com/u/0/115984868678744352358/posts/9zA3a96XahN
2. Fresh content. In general, fresh content will help your rankings. But there are a couple of things to look out for when updating your content.
- If the content changes significantly from the original, Google may interpret it as contextually different and alter its ranking score.
- The same goes for title tags and other on-page factors. If these change too much from the original, Google may do a "reset" on the page, which basically says "this is an entirely new subject, so you have to earn your rankings again."
- Internal links. Sometimes our content ranks from the power of internal text links, and we can inadvertently change these when updating content. In the absence of strong external link signals, this effect can be strong.
If you kept your content, subject matter and internal links fairly consistent, there may be other factors at work, such as an algorithm update or the aforementioned duplicate content issue.
Hope this helps. Best of luck with your SEO!
-
Hi Aaron,
I have done very little link building for these pages, and the backlinks I do have are from authority websites, so it seems very unlikely that this is the cause of the issue.
I just can't figure out what the issue could be. For all pages in that directory to no longer be ranked, it seems too much of a coincidence that every page in the directory was re-written and then, one week later, vanished from the SERPs.
Any more suggestions would be very much appreciated.
-
It really sounds like a mistaken blacklisting if the rankings have dropped "altogether". It could be based on content, but if your bounce rate is dropping, your content is better; if a mistake was made, I think some bad backlinks are more likely.
If this occurred at the end of February, it would predate Penguin, but blacklistings do occur between updates. One idea for resolving it would be to check some of the lower domain-authority backlinks, go to their sites, and from there check the other sites they are linking to and see if those sites have also suffered. Once you pinpoint the location of the bad links, you can start your clean-up from there.
Related Questions
-
Please help - Duplicate Content
Hi, I am really struggling to understand why my site has so many duplicate content issues. It's flagging up as ridiculously high, and I have no idea how to fix it. Can anyone help me, please? The website is www.firstcapitol.co.uk
Intermediate & Advanced SEO | Alix_SEO
Content Audit Questions
Hi Mozzers,
Having worked on my company's site for a couple of months now correcting many issues, I'm now ready to begin looking at a content review. Many areas of the site contain duplicate content, the main causes being:
1. Category page duplications, e.g.:
- Widget Page contains "Blue Widget Extract"
- Widget Page contains "Red Widget Extract"
- Blue Widget Page contains "Same Blue Widget Extract"
- Red Widget Page contains "Same Red Widget Extract"
2. Product descriptions:
- Item 1 (identical to Item 2 with the exception of a few words and technical specs)
- Item 2
This is causing almost all the content on the site to get devalued. While I've cleared all Moz errors and warnings, I'm certain this is devaluing most of the website. I was hoping you could answer these questions so I know what to expect once I have made the changes:
- Will the pages that had duplicate content recover once they possess unique content, or should I expect a hard and slow climb back?
- The website has never received any warnings from Google; does this mean recovery from issues like duplicate content will be quicker?
- Several pages rank on page 1 for fairly competitive keywords despite having duplicate content and keyword-spammy content. What are the chances of shooting myself in the foot by editing this content?
- I know I will have to wait for Google to crawl the pages before I see any reflection of the changes, but how long after Google has crawled the pages should I get a realistic idea of how positive the changes were?
As always, thanks for your time!
Intermediate & Advanced SEO | ATP
How do you get these sitelinks in the SERPs?
How do you get these to appear? http://imgur.com/xV5LA6E
Does a website have any control over what appears?
Intermediate & Advanced SEO | EcommerceSite
How to best handle expired content?
Similar to the eBay situation with "expired" content, what is the best way to approach this? Here are a few examples.
With an e-commerce site, for a seasonal category such as "Christmas", what's the best way to handle the category page after it's no longer valid? 404? 301? Leave it as-is and date it by year?
Another example: if I have an RSS feed of videos from a big provider, say Vevo, what happens when Vevo tells me to "expire" a video that is no longer available?
Thank you!
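For the 301 option on a seasonal category, a redirect could be sketched in Apache's mod_alias syntax. The paths and the target category are hypothetical, purely for illustration:

```apache
# Hypothetical .htaccess rule: permanently redirect the retired
# Christmas category to a parent (or evergreen) category page
Redirect 301 /christmas /seasonal
```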
Intermediate & Advanced SEO | JDatSB
Negative SEO + Disavow
My site is very new (~1 year old), but due to good PR we have gotten some decent links and are already ranking for a key term. This may be why someone decided to start a negative SEO attack on us. We had fewer than 200 linking domains up until 2 weeks ago, but since then we have been getting 100+ new domains per day, with anchor texts that are either targeted to that key term or are from porn websites. I've gone through the links to get ready to submit a disavow... but should I do it? My rankings and site traffic have not been affected yet.
Reasons for my hesitation:
1. Google always warns against using the disavow tool and says "you shouldn't have to use it if you are a normal website" (it feels 'guilty until proven innocent').
2. Some say Google is only trying to get the data to see if there are any patterns within the linking sites. I don't want the site owners to get hurt, since the villain is someone else using XRumer to put spammy comments on their sites.
What would you do?
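For reference, if a disavow were submitted, the file itself is just a plain-text list; `domain:` entries are generally used for wholesale spam domains rather than listing every URL. The domains below are made-up examples, not real data:

```text
# Domains from the negative SEO attack (made-up examples)
domain:spam-example-1.com
domain:spam-example-2.net
# Individual URLs can also be listed one per line
http://spam-example-3.org/spammy-comment-page.html
```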
Intermediate & Advanced SEO | ALLee
Is this ok for content on our site?
We run a printing company, and as an example, the grey box (at the bottom of the page) is what we have on each page: http://www.discountbannerprinting.co.uk/banners/vinyl-pvc-banners.html We used to use this but tried to get most of the content onto the page itself; we now want to add a bit more in-depth information to each page. The question I have is: would a 1,200-word document be OK in there and not look bad to Google?
Intermediate & Advanced SEO | BobAnderson
Wrong page in serps
Hi
Intermediate & Advanced SEO | niclaus78
I've been working with a law firm's website for a couple of years, and we've encountered a problem. The pages were divided to target employers and employees separately. For the very targeted keywords mentioning either employees or employers everything was good, but for broader, less targeted keywords (e.g. unfair dismissal) Google chooses either one page or the other, which is a problem. So I created "bridge" pages where all the topics are explained; users are directed there and then choose where to go. The problem is that a lot of off-page work was done over these years targeting one page or the other. What I plan to do is:
- Create a new sitemap and change the priorities, so the new pages will have a priority of 1 and the others less.
- Point bookmarks, articles, etc. at the new pages.
- Link the new pages from the home page so that they get the home page's link juice; they are also now more of a category page in the site map, so a level up compared to the previous ones.
Questions:
1. Is it worthwhile adding a rel canonical tag to the new pages and rel alternate to the previous pages, or, if it's not a question of duplicate content, will it have no impact?
2. What other things should I take into consideration?
Thanks a lot.
nico
How should I exclude content?
I have category pages on an e-commerce site that are showing up as duplicate pages. On top of each page are register and login links, and when selected they come up as category/login and category/register. I have 3 options to attempt to fix this, and was wondering which you think is best:
1. Use robots.txt to exclude them. There are hundreds of categories, so it could become large.
2. Use canonical tags.
3. Force Login and Register to go to their own pages.
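If option 1 were chosen, wildcard patterns could keep the robots.txt short even with hundreds of categories. A sketch, assuming the category/login and category/register URL patterns described (Google supports the `*` wildcard, though not every crawler does):

```text
# Block the duplicate login/register variants of every category page
User-agent: *
Disallow: /*/login
Disallow: /*/register
```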
Intermediate & Advanced SEO | EcommerceSite