Fresh content has had a negative effect on SERPs
-
Hi there,
I was ranking pretty well for highly competitive keywords without actually doing any link building (please see the graph attached), so I thought I had an opportunity here to get to page 1 for these keywords. The plan was to write fresh, original content for these pages, because hey, Google loves fresh content, right?
Well, it seems not. One week after these pages were rewritten (21st Feb 2012), all of them dropped altogether. Please note: all the pages were under the same directory:
/health/flu/keyword-1
/health/flu/keyword-2 and so on...
I have compared both versions, as I have backups of the old content:
- On average, there are more words on each of the new pages compared to the previous pages
- Lower bounce rate by at least 30% (via AdWords)
- More time on site by at least 2 minutes (via AdWords)
- More page visits (via AdWords)
- Lower keyword density: on average 4% (new pages) compared to 9% (old content) across all pages
Since the end of February, these pages are still not ranking for these keywords. The funny thing is, these keywords are on page 1 of Bing.
Another note: we launched an Irish version of the website using the exact same content. I have done all the checks via Webmaster Tools to make sure it's geotargeted to Ireland, and I have also put hreflang tags on both websites (just in case).
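For anyone comparing notes, a minimal reciprocal hreflang setup between a main site and an Irish mirror might look like the snippet below. This is only a sketch: the domains and path are placeholders (not the poster's actual URLs), and it assumes the main site targets the UK. Both pages must carry the same pair of tags, otherwise Google ignores the annotations:

```html
<!-- In the <head> of BOTH the main (UK) page and the Irish page — hypothetical URLs -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-ie" href="http://www.example.ie/health/flu/keyword-1" />
```

If the tags only appear on one of the two sites, the annotations are not reciprocal and will be disregarded, which is a common reason hreflang "doesn't work."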
If anyone can help with this that would be very much appreciated.
Thanks
-
Howdy! Just wanted to point out this question is several months old.
The statement about "The same content in another language can trigger duplicate content issues" was a bit surprising to hear. Can you provide some more information about that?
I'm more accustomed to what Google says in places like http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192#3 where they state "Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag."
Matt Cutts also talks about it being OK in a video at http://www.youtube.com/watch?v=UDg2AGRGjLQ.
I'm also interested in knowing more about content needing to be relative to the directory name. Can you give a few more details?
-
Hello. You will not rank well on Google without backlinks; new sites do sometimes get a temporary boost early in their life because of the fresh content. The same content in another language can trigger duplicate content issues. Also look into your directory/URL structure, because the content should be relevant to the directory name. Hope you figure it out; if worst comes to worst, you can also roll back your changes and observe.
Related Questions
-
How do hreflang attributes affect ranking?
We have a site in English. We are considering translating the site into Dutch. If we use an hreflang attribute, does that mean we have to create a duplicate page in Dutch for each English page, or does Google auto-translate? How would duplicate pages, even if they are in a different language, affect ranking?
International SEO | Substance-create
Duplicate Content - International Sites - AirBNB
Good morning. Just a quick question: why does Airbnb not get penalised for duplicate content on its sites? For example, the following two URLs (and probably more for other countries) both rank appropriately in Google (UK and COM): https://www.airbnb.co.uk/help/getting-started/how-to-travel
International SEO | joogla
https://www.airbnb.com/help/getting-started/how-to-travel — there are no canonical tags, no alternate tags, etc. If I look at the following: https://www.airbnb.co.uk/s/London--United-Kingdom
https://www.airbnb.com/s/London--United-Kingdom — they both have alternate tags pointing to the other language versions, which I would expect. However, they also both point to themselves as canonical. Would this not be duplicate content? Thanks for your insights. Shane
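For reference, the pattern described here — each page pointing to itself as canonical while cross-referencing the other locale with alternate tags — would look roughly like this in the .co.uk page's `<head>`. This is a sketch based on the URLs above, not a copy of Airbnb's actual markup:

```html
<link rel="canonical" href="https://www.airbnb.co.uk/s/London--United-Kingdom" />
<link rel="alternate" hreflang="en-gb" href="https://www.airbnb.co.uk/s/London--United-Kingdom" />
<link rel="alternate" hreflang="en" href="https://www.airbnb.com/s/London--United-Kingdom" />
```

A self-referencing canonical combined with reciprocal hreflang is the pattern Google documents for same-language, different-region pages: the hreflang annotations tell Google the pages are intentional regional alternates rather than accidental duplicates, so no penalty applies.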
Do you think the SEs would see this as duplicate content?
Hi Mozzers! I have a U.S. website and a Chinese version of that U.S. website. The China site only gets direct and PPC traffic because the robots.txt file is disallowing the SEs from crawling it. Question: if I added English SKU descriptions and English content to the China site (which are also on our U.S. site), will the SEs penalize us for duplicate content even though the robots.txt file doesn't allow them to see it? I plan on translating the descriptions and content to Chinese at a later date, but wanted to ask if the above is an issue. Thanks Mozzers!
International SEO | JCorp
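As a point of reference, the site-wide block described in that question would look something like this in the China site's robots.txt (assuming the China site lives on its own host):

```text
# robots.txt at the root of the China site — blocks all compliant crawlers site-wide
User-agent: *
Disallow: /
```

One caveat worth knowing: a robots.txt disallow stops compliant crawlers from fetching the pages at all, so the content generally cannot be compared for duplication — but the blocked URLs themselves can still appear in the index if other sites link to them, since robots.txt prevents crawling, not indexing.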
Is having duplicated content on different domains a problem when using an alternate tag but no canonical?
We will be launching a couple of new language versions. I understand that a ccTLD is mostly considered the best option; however, I thought that to start with it might be better to launch the new language version first in a subdirectory of our established domain with its strong backlink profile, as it may rank much better until I can attract some strong links to the new ccTLD. I would wait for the pages of the new language version to be indexed on the main domain and then, after a month, launch the same content in parallel on the ccTLD, setting up an alternate tag on the main domain pointing to the ccTLD. I would not set up any canonical tag. As I understand it, Google would rank whichever of the two versions ranks higher. This should not cause duplicate content issues, right?
International SEO | lcourse
Any thoughts? EDIT:
For clarification: the languages we are launching are mostly spoken in several countries. E.g. for Portuguese, I would add to the main domain an alternate tag pointing Brazilian visitors to the Brazilian ccTLD, but no alternate tag for Portuguese visitors. For Korean, I would add to the main domain an alternate tag for visitors in South Korea, but not one for visitors in North Korea.
Looking for content writers for multi-language SEO
Hi all, I'm currently doing a lot of work for a UK client who has multiple sites outside the UK (all part of the same business). We're currently discussing the option of us handling all of his SEO for his German, French, Spanish and Italian sites too, but we only have access to one person in the office who can speak French and Spanish. They're currently booked up on other jobs that we can't really move them off, so I'm looking at options for outsourcing some of the content writing. My question is: does anyone know of any high-quality content writing services with writers available to write in the languages above? We're going to focus initially on the on-site strategy and building up high-quality content. At the moment, they don't have much relevant content on their website, so we're going to look at this first. Moving forward, we'll be looking at their off-site strategy and trying to find places to submit high-quality articles, plus guest blogging and PR opportunities. Any tips anyone has on this side (in terms of outsourcing to native speakers) would be quite useful too! Many thanks,
International SEO | PinpointDesigns
Lewis0 -
Google search cache points to and uses content from different url
We have two sites, one in New Zealand (ecostore.co.nz) and one in Australia (ecostoreaustralia.com.au). Both sites have been assigned the correct country in Webmaster Tools. Both sites use the same URL structure and content for product and category pages. Both sites run off the same server in the US but have unique IP addresses. When I go to google.com.au and search for site:ecostoreaustralia.com.au, I get results which Google says are from the Australian domain, yet on closer inspection it is actually drawing content from the NZ website. When I view a cached page, the URL bar displays the AU domain name, but on the page (in the top grey box) it says: _This is Google's cache of http://www.ecostore.co.nz/pages/our-highlights. _ Here is the link to this page: http://webcache.googleusercontent.com/search?q=cache:Zg_CYkqyjP4J:www.ecostoreaustralia.com.au/pages/our-highlights+&cd=7&hl=en&ct=clnk&gl=au In the last four weeks, the ranking of the AU website has dropped significantly, and the NZ site now ranks first in Google AU, where before the AU site was listed first. Any idea what is going wrong here?
International SEO | ArchMedia
Will duplicate content across international domains have a negative effect on our SERPs?
Our corporate website, www.tryten.com, showcases and sells our products to all of our customers. We have Canadian and UK-based customers and would like to duplicate our website onto .ca and .co.uk domains respectively to better serve them. These sites will showcase the same products; only the price and ship-from locations will change. The phone numbers and contact info will also be altered. The sites will all be on one server. On each of the sites there will be a country selector which will take you to the appropriate domain for the country selected. Will doing this negatively affect our rankings in the US, UK and Canada?
International SEO | tryten
Backlinks that we didn't place are killing our SERP rank and PR
I am in need of advice regarding backlinks that we did not place, which are hurting our search engine results. How and why they got there I cannot explain, but they have appeared recently and are damaging our SERP rankings.

For several years, I have been a member of SEOmoz, and we have done our search engine optimization in house. I am the owner of a personal injury law firm, which is a competitive field in search engine optimization. Recently, in the spring, we updated our website, added significant content (over 100 additional pages), set up a better site structure, and completed a significant backlink campaign from white-hat sources. As a result, we were the strongest law firm in search engine results in the state of Arizona, and the PageRank of our home page went from a 4 to a 6, while our next highest level of pages went from a 3 to a 6. This happened in a 10-week period. Our search engine results were fantastic. We were getting a significant amount of business from our Places page and our organic results. That has almost completely dried up.

Approximately 6-8 weeks later, we started having some serious problems. Specifically, our search engine results decreased significantly and our PageRank dropped from a six to a four. So we started using SEOmoz tools to see what the problem was, and when we created an Open Site Explorer report, there were approximately 1,000 different links from very shady websites now pointing to our home page. Some of these linking URLs prompt a download of video and other files. Others of the links are simply on junk sites. Obviously, some other person placed these links.

First and foremost, I am interested in maintaining the integrity of our site, and if there is a way to remove these links and protect against this in the future, that is what I want. Secondly, if there is a way to find out who did this, I would like to know that also. What options and/or actions should be taken?
I am thinking that I may need to employ a professional/consultant. Will I have to transfer content to another domain? Your thoughts and help are appreciated. Thanks,
International SEO | MFC