Fresh content has had a negative effect on SERPs
-
Hi there,
I was ranking pretty well for highly competitive keywords without actually doing any link building (please see the attached graph), so I thought I had an opportunity to get to page 1 for these keywords. The plan was to write fresh, original content for these pages, because hey, Google loves fresh content, right?
Well, it seems not. One week after these pages were rewritten (21st Feb 2012), all of them dropped together. Please note: all the pages were under the same directory:
/health/flu/keyword-1
/health/flu/keyword-2 and so on...
I have compared the new pages with the old ones, as I have backups of the old content:
- On average, there are more words on each new page than on the old pages
- Bounce rate is lower by at least 30% (via AdWords)
- Time on site is higher by at least 2 minutes (via AdWords)
- More page visits (via AdWords)
- Keyword density is lower: on average 4% on the new pages compared to 9% in the old content, across all pages
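For anyone wanting to reproduce the density comparison, a rough sketch of the calculation is below. The copy strings are made-up placeholders, not my actual page content:

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[\w'-]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Illustrative copy only -- not the real page text
old_copy = "flu advice: flu symptoms, flu remedies and flu prevention"
new_copy = "advice on treating influenza symptoms and preventing infection at home"

print(round(keyword_density(old_copy, "flu"), 1))  # noticeably higher in the old copy
print(round(keyword_density(new_copy, "flu"), 1))
```

This is the simplest possible definition of density (exact word matches over total words); real tools count phrase matches and stemmed variants differently, so the absolute numbers will vary by tool.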
Since the end of February, these pages have still not ranked for these keywords. The funny thing is, these keywords are on page 1 of Bing.
Another note: we launched an Irish version of the website using the exact same content. I have done all the checks via Webmaster Tools to make sure it's pointing to Ireland, and I have also added hreflang tags on both websites (just in case).
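For reference, a minimal hreflang pairing of a UK page with its Irish equivalent would look something like this (placeholder domains, not my real URLs):

```html
<!-- in the <head> of both the UK and Irish versions of a page (placeholder domains) -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-ie" href="http://www.example.ie/health/flu/keyword-1" />
```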
If anyone can help with this that would be very much appreciated.
Thanks
-
Howdy! Just wanted to point out that this question is several months old.
The statement that "the same content in another language can trigger duplicate content issues" was a bit surprising to hear. Can you provide some more information about that?
I'm more accustomed to what Google says in places like http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192#3, where they state: "Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to 'hide' the duplicates by disallowing crawling in a robots.txt file or by using a 'noindex' robots meta tag."
Matt Cutts also talks about it being OK in a video at http://www.youtube.com/watch?v=UDg2AGRGjLQ.
I'm also interested in knowing more about content needing to be relevant to the directory name. Can you give a few more details?
-
Hello. You will not rank well on Google without backlinks; new sites do sometimes get a temporary boost early in their life because of the fresh content. The same content in another language can trigger duplicate content issues. Also look into your directory/URL addresses, because the content should be relevant to the directory name. Hope you figure it out; if worse comes to worst, you can also roll back your changes and observe.
Related Questions
-
Important pages are being 302 redirected, then 301 redirected, to support language versions. Is this negatively affecting the link juice distribution of our domain?
Hi mozzers, Prior to my arrival, in order to support international locations and offer multiple language versions of the same content, the company decided to restructure its URLs around locale-based URLs. We went from
https://example.com/subfolder to https://example.com/us/en-us/new-subfolder (US)
https://example.com/ca/en-us/new-subfolder (CAN)
https://example.com/ca/fr-ca/new-subfolder (CAN)
https://example.com/de/en-us/new-subfolder (Ger)
https://example.com/de/de-de/new-subfolder (Ger)

This had implications for redirecting old URLs to new ones. All important URLs such as https://example.com/subfolder were
302 redirected to https://example.com/us/en-us/subfolder and then 301 redirected to the final URL. According to the devs: if you change the translation of the page or the locale, a 302 needs to happen so you see the same version of the page in German or French; then a 301 redirect happens from the legacy URL to the new version. If the 302 redirect were skipped, you would only be able to see one version/language of that page.
For instance:
http://example.com/subfolder/state/city --> 301 redirect to [LEGACY URL]
https://example.com/subfolder/state/city --> 302 redirect to
https://example.com/en-us/subfolder/state/city --> 301 redirect to
https://example.com/us/en-us/new-subfolder/city-state [NEW URL] I am wondering whether these 302s are hurting our link juice distribution, or whether that is completely fine since they all end up at a 301 redirect. Thanks.
-
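As a side note, before reasoning about equity flow it helps to enumerate the chain itself (e.g. with `curl -I` hop by hop) and flag the temporary redirects. A minimal sketch, using the hypothetical URLs from the question:

```python
def audit_redirect_chain(hops):
    """Flag temporary redirects in an ordered chain of (url, http_status) hops.

    `hops` should end with the final destination's response.
    """
    warnings = []
    for i, (url, status) in enumerate(hops):
        if status == 302:
            warnings.append(f"hop {i}: {url} is a temporary (302) redirect")
    if hops and hops[-1][1] != 200:
        warnings.append("chain does not end at a 200 response")
    return warnings

# Hypothetical chain matching the example above
chain = [
    ("https://example.com/subfolder", 302),             # legacy URL
    ("https://example.com/us/en-us/subfolder", 301),    # locale hop
    ("https://example.com/us/en-us/new-subfolder", 200),
]
print(audit_redirect_chain(chain))
```

The `audit_redirect_chain` helper and the URLs are illustrative, not part of the original question; the point is simply to make each 302 in the chain visible so it can be reviewed.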
Geolocation issue: Google not displaying the correct URL in the SERPs
Hello, I'm running a multi-country domain with this structure: domain.com/ar/
domain.com/mx/
domain.com/cl/
etc. I also have domain.com/int/ for x-default.
domain.com/category/ does a 301 redirect via IP geolocation to the corresponding URL; for example, if your IP is from Mexico, you are redirected to domain.com/mx/category/. hreflang is correct. Webmaster Tools geolocation is correct. Example of the issue I'm facing right now: when users from Chile do a keyword search in Google Chile, the domain ranks well, but the URL that appears in the SERP is the /mx/ version, the /int/ version, or some other country version; other times it is the /cl/ version. The same happens for all users / countries / keywords. I need to understand what I'm doing wrong, because Google is not displaying the correct URL version in the SERPs for the country of the user doing the search. Thank you so much! I will appreciate your ideas. PS: I think I should try changing the 301 to a 302 redirect, or removing those redirects completely. Any ideas? Suggestions? Thanks!
-
Duplicate content on multistore Magento website
Hello there, We run a Magento-based e-commerce site in the UK, for example: domain.com. We are looking to launch US and Australian versions of the website: usa.domain.com and au.domain.com. Obviously the currency will be different, and so will some of the content. Will we be penalised for having duplicate content across these 3 sites? (Some pages will be very similar or the same.) Thanks, Robert
-
Is duplicated content on different domains a problem when using an alternate tag, but no canonical?
We will be launching a couple of new language versions. I understand that a ccTLD is mostly considered the best option; however, I thought that to start with it might be better to launch each new language version first in a subdirectory of our established domain with its strong backlink profile, as it may rank much better until I can attract some strong links to the new ccTLD. I would wait for the pages of the new language versions to be indexed on the main domain, and then after a month launch the same content in parallel on the ccTLD, setting up an alternate tag on the main domain pointing to the ccTLD. I would not set up any canonical tag. As I understand it, Google would rank whichever of the 2 versions ranks higher. This should not cause duplicated content issues, right?
Any thoughts? EDIT:
For clarification: the languages we are launching are mostly spoken in several countries. E.g. for Portuguese I would add to the main domain an alternate tag pointing Brazilian visitors to the Brazilian ccTLD, but no alternate tag for Portuguese visitors. For Korean I would add to the main domain an alternate tag for visitors in South Korea, but not one for visitors in North Korea.
-
Impact of Japanese .jp site duplicate content?
Our main website is at http://www.traxnyc.com and we just launched a Japanese version of the site on the http://www.traxnyc.jp domain. However, all the images used on the .jp site are linked from the .com site. Would hotlinking images hurt me in Google at all? Also, there is quite a bit of duplicate content on the .jp site at the moment: only a few things have been translated into Japanese, and for the most part the layouts and words are exactly the same (in English). Would this hurt my Google rankings in the US at all? Thanks for all your help.
-
Google search cache points to and uses content from different url
We have two sites, one in New Zealand (ecostore.co.nz) and one in Australia (ecostoreaustralia.com.au). Both sites have been assigned the correct country in Webmaster Tools. Both sites use the same URL structure and content for product and category pages. Both sites run off the same server in the US but have unique IP addresses. When I go to google.com.au and search for site:ecostoreaustralia.com.au, I get results which Google says are from the Australian domain, yet on closer inspection it is actually drawing content from the NZ website. When I view a cached page, the URL bar displays the AU domain name, but on the page (in the top grey box) it says: "This is Google's cache of http://www.ecostore.co.nz/pages/our-highlights." Here is the link to this page: http://webcache.googleusercontent.com/search?q=cache:Zg_CYkqyjP4J:www.ecostoreaustralia.com.au/pages/our-highlights+&cd=7&hl=en&ct=clnk&gl=au In the last four weeks the ranking of the AU website has dropped significantly, and the NZ site now ranks first in Google AU, where before the AU site was listed first. Any idea what is going wrong here?
-
Does IP filtering have a negative impact on SEO?
If a large site has multiple regions (Australia, USA, UK, France), how will IP filtering to a particular area affect SEO? E.g. I live in the UK, and if I visit the said website I would automatically be redirected to the UK subfolder of the site, whereas somebody searching in Australia would be redirected to the AUS folder. Will there be any detrimental effect on SEO, and will the search engines still be able to crawl the entire site no matter which data centre is being used?
-
Internationally targeted subdomains and duplicate content
A client has a site they'd like translated into French, not for the French market but for French-speaking countries. My research tells me the best way to implement this for this particular client is to create subfolders for each country. For ease of implementation I've decided against ccTLDs and subdomains. So for example, I'll create www.website.com/mr/ for Mauritania and in GWT set this to target Mauritania. Excellent so far. But then I need to build another subfolder for Morocco. I'll then create www.website.com/ma/ for Morocco and in GWT set this to target Morocco. Now the content in these two subfolders will be exactly the same, and I'm thinking about doing this for all French-speaking African countries. It would be nice to use www.website.com/fr/, but in GWT you can only set one target country. Duplicate content issues arise, and my fear of perturbing the almighty Google becomes a possibility. My research indicates that I should simply canonical back to the page I want indexed. But I want them both to be indexed, surely!? I therefore decided to share my situation with my fellow SEOs to see if I'm being stupid or missing something simple (both a distinct possibility!).
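One thing I'm considering: hreflang annotations can declare a region-specific alternate per country even when the content is identical, independently of the GWT one-country-per-folder limit. A rough sketch using the example subfolders above:

```html
<!-- each regional page lists every regional alternate, including itself (example paths) -->
<link rel="alternate" hreflang="fr-mr" href="http://www.website.com/mr/" />
<link rel="alternate" hreflang="fr-ma" href="http://www.website.com/ma/" />
```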