Fresh content has had a negative effect on SERPs
-
Hi there,
I was ranking pretty well for highly competitive keywords without actually doing any link building (please see the graph attached), so I thought I had an opportunity here to get to page 1 for these keywords. The plan was to write fresh, original content for these pages, because hey, Google loves fresh content, right?
Well, it seems NOT. One week after these pages were rewritten (21st Feb 2012), all of these pages dropped altogether. Please note: all the pages were under the same directory:
/health/flu/keyword-1
/health/flu/keyword-2 and so on...
I have compared both versions, as I have backups of the old content:
- On average, there are more words on each of the new pages compared to the previous pages
- Lower bounce rate by at least 30% (via AdWords)
- More time on site by at least 2 minutes (via AdWords)
- More page visits (via AdWords)
- Lower keyword density: on average 4% (new pages) compared to 9% (old content) across all pages
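For reference, a density figure like the 4% vs 9% above can be computed roughly as follows; this function and its tokenizer are illustrative sketches, not a standard SEO tool:

```python
import re

def keyword_density(text, keyword):
    """Rough keyword-density estimate: the share of the page's words
    taken up by occurrences of the (possibly multi-word) keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    # Count whole-phrase occurrences by sliding a window over the word list
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits * n / len(words)

# E.g. 3 occurrences of "flu" out of 8 words -> 37.5%
print(keyword_density("flu shots help prevent flu every flu season", "flu"))
```

Different tools tokenize differently, so treat any single density number as approximate.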
So, since the end of February, these pages are still not ranked for these keywords. The funny thing is, these keywords are on page 1 of Bing.
Another NOTE: we launched an Irish version of the website using the exact same content. I have done all the checks via Webmaster Tools, making sure it's pointing to Ireland, and I have also got hreflang tags on both websites (just in case).
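For reference, a reciprocal hreflang pair for a UK and an Irish version would normally look like this in the `<head>` of each page on both sites (example.co.uk / example.ie are placeholder domains, since the real one isn't given; the path is taken from the question):

```html
<!-- Hypothetical domains; both sites carry both annotations -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-ie" href="http://www.example.ie/health/flu/keyword-1" />
```

hreflang annotations only take effect when each version references the other, so the same pair has to appear on both sites.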
If anyone can help with this that would be very much appreciated.
Thanks
-
Howdy! Just wanted to point out that this question is several months old.
The statement about "The same content in another language can trigger duplicate content issues" was a bit surprising to hear. Can you provide some more information about that?
I'm more accustomed to what Google says in places like http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192#3 where they state "Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag."
Matt Cutts also talks about it being OK in a video at http://www.youtube.com/watch?v=UDg2AGRGjLQ.
I'm also interested in knowing more about content needing to be relative to the directory name. Can you give a few more details?
-
Hello. You will not rank well on Google without backlinks; sometimes new sites do get a temporary boost early in their life because of the fresh content. The same content in another language can trigger duplicate content issues. Also, look into your directory/URL addresses, because the content should be relevant to the directory name. Hope you figure it out; if worse comes to worst, you can also roll back your changes and observe.
Related Questions
-
Is this approach of returning different content depending on IP beneficial for international SEO?
I've decided to use subfolders for my site, and from everything I've read online it seems I shouldn't change the page content depending on IP. Yet I know of a successful, well-funded site that hires full-time SEO staff and does just that, and I'm wondering whether they know something I don't which is helping their SEO. From everything I've read online, this is the format I think I should use:
mysite.com/us/red-wigs
mysite.com/gb/red-wigs
mysite.com/red-wigs does not exist
This is the format the other site is using:
othersite.com/red-wigs (from a US IP address)
othersite.com/red-wigs (from a UK IP address)
othersite.com/gb/red-wigs
The content on othersite.com/red-wigs is identical to othersite.com/gb/red-wigs when loading from a UK IP address, and a lot of URLs without /gb/ are being returned when searching Google. The benefit I can think of is that US pages returned for UK-based searches will show the correct content. Are there any other gains to this approach? I'm concerned that if I use this approach for different languages, the radically differing content of othersite.com/red-wigs depending on the location of the crawler might confuse Google; also, changing content depending on IP generally seems to be recommended against. Thanks
International SEO | Mickooo
Duplicate content across English-speaking ccTLDs
Morning, If a brand offering pretty much the same products/services has 4 English-speaking ccTLDs (.com, .co.uk, .com.au and .co.nz), what are the best practices when thinking about SEO and content? In an ideal world, all content should be totally unique, but when the products/services offered across every ccTLD are the same, this may prove tricky. Am I right in thinking that duplicate content across ccTLDs is tolerated by Google, as they know you're targeting specific countries? Cheers!
International SEO | PeaSoupDigital
Is International Geotargeting with Duplicate Content Effective?
A company located in Canada is currently targeting Canada through the geotargeting setting in Google Webmaster Tools. Google.ca rankings are good, but Google.com rankings are not. The company would like to gain more traction for US people using google.com. The idea on the table is to set up a subfolder www.domain.com/us/ and use WMT to designate this version for the US. Here's the kicker: the content is exactly the same. Will Google consider the US version duplicate content? Is this an effective way to target US and Canada at the same time? Is it better to forget a duplicate US site altogether and use the "unlisted" setting in WMT?
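If the /us/ subfolder is added, one common hedge against the duplicate-content risk is to annotate both versions with reciprocal hreflang tags, so Google knows each copy targets a different country. A sketch using the www.domain.com placeholder from the question (the page path is hypothetical):

```html
<!-- In the <head> of BOTH www.domain.com/page and www.domain.com/us/page -->
<link rel="alternate" hreflang="en-ca" href="http://www.domain.com/page" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.com/us/page" />
```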
International SEO | AliveWired
Do you think the SEs would see this as duplicate content?
Hi Mozzers! I have a U.S. website and a Chinese version of that U.S. website. The China site only gets direct and PPC traffic because the robots.txt file is disallowing the SEs from crawling it. Question: If I added English sku descriptions and English content to the China site (which is also on our U.S. site), will the SEs penalize us for duplicate content even though the robots.txt file doesn’t allow them to see it? I plan on translating the descriptions and content to Chinese at a later date, but wanted to ask if the above was an issue. Thanks Mozzers!
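For context, a robots.txt that disallows all crawling of the China site, as described, would look like this. Note that robots.txt blocks crawling, so the engines cannot read the English content at all and so cannot compare it for duplication, though blocked URLs can still end up indexed (URL only) if external links point at them:

```text
# robots.txt at the root of the China site, as described:
# blocks all compliant crawlers from the entire site
User-agent: *
Disallow: /
```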
International SEO | JCorp
Duplicate content international homepage
Hi, We have a website which is in English and Dutch. Our website has the following structure:
www.eurocottage.com: Dutch or English, depending on the language the user has set in a cookie.
www.eurocottage.com/nl/: Dutch language.
www.eurocottage.com/en/: English language.
According to Google, the .com and eurocottage.com/nl/ have duplicate content, because initially they are both in Dutch. What would be the best strategy to fix this problem? Thanks, Bram
International SEO | Bram76
Is having duplicated content on different domains a problem when using alternate tag, but no canonical?
We will be launching a couple of new language versions. I understand that a ccTLD is mostly considered the best option; however, I thought that to start with it might be better to launch the new language version first in a subdirectory of our established domain with its strong backlink profile, as it may rank much better until I can attract some strong links to the new ccTLD. I would wait for the pages of the new language versions to be indexed on the main domain and then, after a month, launch the same content in parallel on the ccTLD, setting up an alternate tag on the main domain pointing to the ccTLD. I would not set up any canonical tag. As I understand it, Google would rank whichever of the 2 versions ranks higher. This should not cause duplicate content issues, right? Any thoughts?
EDIT: For clarification, the languages we are launching are mostly spoken in several countries. E.g. for Portuguese I would add to the main domain an alternate tag for Brazilian visitors pointing to the Brazilian ccTLD, but no alternate tag for Portuguese visitors. For Korean I would add to the main domain an alternate tag for visitors in South Korea, but not one for visitors in North Korea.
International SEO | lcourse
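One caveat on the plan above: hreflang ("alternate tag") annotations only work when they are reciprocal, so both the main-domain page and the ccTLD page must carry the pair. A sketch of the Portuguese/Brazilian case described (the domain names and path are placeholders):

```html
<!-- On BOTH maindomain.com/pt/page and example.com.br/page -->
<link rel="alternate" hreflang="pt" href="http://maindomain.com/pt/page" />
<link rel="alternate" hreflang="pt-br" href="http://example.com.br/page" />
```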
Ranking well internationally, usage of hreflang, duplicate country content
I'm trying to wrap my head around the various options when it comes to international SEO, specifically how to rank well in countries that share a language, and the risk of duplicate content in these cases. We have a chance to start from scratch because we're switching to a new e-commerce platform, and we were looking into using hreflang. Let's assume the example of a .com webshop that targets both Austria and Germany. One option is to include both language and region in the URL, and mark these as such using hreflang:
webshop.com/de-de/german-language-content (with hreflang de-de)
webshop.com/de-at/german-language-content (with hreflang de-at)
Another option would be to only include the language in the URL, not the region, and let Google figure out the rest:
webshop.com/de/german-language-content (with hreflang de)
Which would be better? The risk of inserting a country, of course, is that you're introducing duplicate content, especially since for webshops there are usually only minor differences in content (pricing, currency, a word here and there). If hreflang is an effective means to make sure that visitors from each country get the correct URL from the search engines, I don't see any reason not to use it this way. But if search engines get it wrong, users will end up on the wrong page and will have to switch country, which could result in conversion loss. Also, if you only use language in the URL, is it useful at all to use hreflang? Aren't engines perfectly able to recognize language already? I don't mention ccTLDs here because most of the time we're required to use a .com domain owned by our customer. But if we did, would that be much better? And would it still be useful to use hreflang then?
webshop.de/german-language-content (with hreflang de-de)
webshop.at/german-language-content (with hreflang de-at)
Michel Hendriks, Docdata Commerce
International SEO | DocdataCommerce
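For the first option discussed above, the reciprocal annotations on each of the two country URLs would look roughly like this (the x-default line, pointing visitors from any other country at a fallback version, is an optional extra and not something from the question; the fallback URL is a placeholder):

```html
<link rel="alternate" hreflang="de-de" href="http://webshop.com/de-de/german-language-content" />
<link rel="alternate" hreflang="de-at" href="http://webshop.com/de-at/german-language-content" />
<link rel="alternate" hreflang="x-default" href="http://webshop.com/" />
```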
How to fix the duplicate content problem on different domains (.nl /.be) of your brand's websites in multiple countries?
Dear all, what is the best way to fix the duplicate content problem across the different domains (.nl / .be) of your brand's websites in multiple countries? What must I add to the code of my .nl website to avoid duplicate content and to keep the .nl website out of google.be, but still well-indexed in google.nl? And what must I add to the code of my .be website to keep the .be website out of google.nl, but still well-indexed in google.be? Thanks in advance!
International SEO | HMK-NL