Fresh content has had a negative effect on SERPs
-
Hi there,
I was ranking pretty well for highly competitive keywords without actually doing any link building (please see the graph attached), so I thought I had an opportunity to get to page 1 for these keywords. The plan was to write fresh, original content for these pages, because hey, Google loves fresh content, right?
Well, it seems not. One week after these pages were rewritten (21st Feb 2012), all of them dropped altogether. Please note: all the pages were under the same directory:
/health/flu/keyword-1
/health/flu/keyword-2 and so on...
I have compared both versions, as I have backups of the old content:
- On average, there are more words on each new page than on the old pages
- Lower bounce rate, by at least 30% (via AdWords)
- More time on site, by at least 2 minutes (via AdWords)
- More page visits (via AdWords)
- Lower keyword density: on average 4% (new pages) compared to 9% (old content) across all pages
Since the end of February, these pages have still not ranked for these keywords. The funny thing is, these keywords are on page 1 of Bing.
Another note: we launched an Irish version of the website using the exact same content. I have done all the checks via Webmaster Tools to make sure it's pointing to Ireland, and I have also put hreflang tags on both websites (just in case).
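For reference, the hreflang pairing on each page looks roughly like this (example.com and example.ie are placeholders standing in for our real domains, and en / en-IE are the codes I'd assume for this setup):

<!-- sketch with placeholder domains: the main site as generic English, the Irish site as en-IE -->
<link rel="alternate" hreflang="en" href="https://www.example.com/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-IE" href="https://www.example.ie/health/flu/keyword-1" />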
If anyone can help with this that would be very much appreciated.
Thanks
-
Howdy! Just wanted to point out this question is several months old.
The statement about "The same content in another language can trigger duplicate content issues" was a bit surprising to hear. Can you provide some more information about that?
I'm more accustomed to what Google says in places like http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192#3 where they state "Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag."
Matt Cutts also talks about it being OK in a video at http://www.youtube.com/watch?v=UDg2AGRGjLQ.
I'm also interested in knowing more about content needing to be relevant to the directory name. Can you give a few more details?
-
Hello. You will not rank well on Google without backlinks; sometimes new sites do get a temporary boost early in their life because of the fresh content. The same content in another language can trigger duplicate content issues. Also look into your directory/URL structure, because the content should be relevant to the directory name. Hope you figure it out. If worse comes to worst, you can also roll back your changes and observe.
Related Questions
-
"Duplicate without user-selected canonical” - impact to SERPs
Hello, we are facing some issues on our project and we would like to get some advice.
Scenario:
We run several websites (www.brandName.com, www.brandName.be, www.brandName.ch, etc.), all in French. All sites have nearly the same content and structure; only minor text differs (some headings, and phone numbers due to the different countries). There are many good-quality pages, but again they are the same across all domains.
Goal:
We want the local domains (be, ch, fr, etc.) to appear in SERPs and also to comply with Google's policy on local language variants and/or canonical links.
Current solution:
Currently we don't use canonicals; instead we use rel="alternate" hreflang tags, including an x-default:
<link rel="alternate" hreflang="fr-BE" href="https://www.brandName.be/" />
<link rel="alternate" hreflang="fr-CA" href="https://www.brandName.ca/" />
<link rel="alternate" hreflang="fr-CH" href="https://www.brandName.ch/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.brandName.fr/" />
<link rel="alternate" hreflang="fr-LU" href="https://www.brandName.lu/" />
<link rel="alternate" hreflang="x-default" href="https://www.brandName.com/" />
Issue:
After Googlebot crawled the websites, we see a lot of "Duplicate without user-selected canonical" in the Coverage/Excluded report (Google Search Console) for most domains. When we inspect some of those URLs, we can see Google has decided the canonical URL points to (for example):
User-declared canonical: None
Google-selected canonical: …the same page, but on a different domain
The strange thing is that even those URLs are in Google and can be found in SERPs. Obviously Google doesn't know what to make of it. We noticed many websites in the same scenario use a self-referencing canonical approach, which is not really "kosher"; we are afraid that if we use the same approach we could get penalized by Google.
Question: what do you suggest to fix the "Duplicate without user-selected canonical" issue in our scenario? Any suggestions/ideas appreciated, thanks. Regards.
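The self-referencing approach we see other websites use would look something like this on the .be homepage (just a sketch combining a self-canonical with our existing hreflang set):

<!-- sketch only: self-referencing canonical added on top of the hreflang set shown above -->
<link rel="canonical" href="https://www.brandName.be/" />
<link rel="alternate" hreflang="fr-BE" href="https://www.brandName.be/" />
<link rel="alternate" hreflang="fr-CA" href="https://www.brandName.ca/" />
<link rel="alternate" hreflang="fr-CH" href="https://www.brandName.ch/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.brandName.fr/" />
<link rel="alternate" hreflang="fr-LU" href="https://www.brandName.lu/" />
<link rel="alternate" hreflang="x-default" href="https://www.brandName.com/" />
-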
Hreflang tags and canonical tags - might be causing indexing and duplicate content issues
Hi, let's say I have a site located at https://www.example.com, and also have subdirectories set up for different languages. For example:
https://www.example.com/es_ES/
https://www.example.com/fr_FR/
https://www.example.com/it_IT/
My Spanish version currently has hreflang tags and a canonical tag implemented. My robots.txt file is blocking all of my language subdirectories. For example:
User-agent: *
Disallow: /es_ES/
Disallow: /fr_FR/
Disallow: /it_IT/
This setup doesn't seem right. I don't think I should be blocking the language-specific subdirectories via robots.txt. What are your thoughts? Does my hreflang tag and canonical tag implementation look correct to you? Should I be doing this differently? I would greatly appreciate your feedback and/or suggestions.
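For example, here is roughly what I'd expect the Spanish homepage to carry once the robots.txt blocks are removed (a rough sketch on my part, assuming self-referencing canonicals and language-country codes taken from the directory names):

<!-- sketch only: self-referencing canonical; es-ES / fr-FR / it-IT assumed from the subdirectory names -->
<link rel="canonical" href="https://www.example.com/es_ES/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
<link rel="alternate" hreflang="es-ES" href="https://www.example.com/es_ES/" />
<link rel="alternate" hreflang="fr-FR" href="https://www.example.com/fr_FR/" />
<link rel="alternate" hreflang="it-IT" href="https://www.example.com/it_IT/" />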
-
Geolocation issue: Google not displaying the correct URL in the SERPs
Hello, I'm running a multi-country domain with this structure:
domain.com/ar/
domain.com/mx/
domain.com/cl/
etc. I also have domain.com/int/ for x-default.
domain.com/category/ does a 301 redirect via IP geolocation to the corresponding URL; for example, if your IP is from Mexico, you get redirected to domain.com/mx/category/. The hreflang is correct, and the Webmaster Tools geolocation is correct.
Example of the issue I'm facing right now: when users from Chile do a keyword search in Google Chile, the domain ranks well, but the URL that appears in the SERP is the /mx/ version, or the /int/ version, or any other country version. Other times it is the /cl/ version. The same happens for all users/countries/keywords. I need to understand what I'm doing wrong, because Google is not displaying in the SERPs the correct URL version for the country of the user who is doing the search. Thank you so much! I will appreciate your ideas.
PS: I think I should try to change the 301 to a 302 redirect, or completely remove those redirects. Any ideas? Suggestions? Thanks!
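For completeness, the hreflang set on each page looks roughly like this (a sketch; es-AR / es-MX / es-CL are my placeholder language-country codes):

<!-- sketch only: per-country alternates plus the /int/ x-default described above -->
<link rel="alternate" hreflang="es-AR" href="https://domain.com/ar/" />
<link rel="alternate" hreflang="es-MX" href="https://domain.com/mx/" />
<link rel="alternate" hreflang="es-CL" href="https://domain.com/cl/" />
<link rel="alternate" hreflang="x-default" href="https://domain.com/int/" />
-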
International SEO & Duplicate Content: ccTLD, hreflang, and rel=canonical tags
Hi everyone, I have a client that has two sites (example.com & example.co.uk), each with the same English content but no hreflang or rel="canonical" tags in place. Would this be interpreted as duplicate content? They haven't changed the copy to speak to specific regions, but have tried targeting the UK with a ccTLD. I've taken a look at some comparable questions on Moz, like this post: https://moz.com/community/q/international-hreflang-will-this-handle-duplicate-content, where one of the answers says "If no translation is happening within a geo-targeted site, HREFLANG is not necessary." If hreflang tags are not necessary, then would I need rel="canonical" to avoid duplicate content? Thanks for taking the time to help a fellow SEO out.
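For example, would adding a pair like this to every page on both sites be the right approach (just a sketch, assuming en-US / en-GB targeting)?

<!-- sketch only: same-language regional alternates; each page would reference its counterpart on the other domain -->
<link rel="alternate" hreflang="en-US" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-GB" href="https://www.example.co.uk/" />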
-
Duplicate Page Content due to Language and Currency
Hi folks, hoping someone can help me out, please. I have a site that I'd like to rank in France and the UK, but I'm getting a stack of duplicate content errors due to English and French pages and GBP and EUR prices. Below is an example of how the home page is duplicated:
http://www.site.com/?sl=en?sl=fr
http://www.site.com/?sl=fr?sl=fr
http://www.site.com
http://www.site.com/?currency=GBP?sl=fr
http://www.site.com/?currency=GBP?sl=en
http://www.site.com/?sl=fr?sl=en
http://www.site.com/?currency=EUR?sl=fr
http://www.site.com/?currency=EUR?sl=en
http://www.site.com/?currency=EUR
http://www.site.com/?sl=en&currency=EUR
http://www.site.com/?sl=en&currency=GBP
http://www.site.com/?sl=en
http://www.site.com/?currency=GBP
http://www.site.com/?sl=en?sl=en
Each page has code in the <head> that updates according to the page you are on. How do I simplify this, and what's the correct approach?
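My current thinking is to put a canonical on every currency variant pointing at the clean language URL, plus hreflang between the two language versions, something like this on the French pages (just a sketch):

<!-- sketch only: collapse the currency parameters into the language URL and cross-reference the two languages -->
<link rel="canonical" href="http://www.site.com/?sl=fr" />
<link rel="alternate" hreflang="en" href="http://www.site.com/?sl=en" />
<link rel="alternate" hreflang="fr" href="http://www.site.com/?sl=fr" />
-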
Impact of Japanese .jp site duplicate content?
Our main website is at http://www.traxnyc.com, and we just launched a Japanese version of the site on the http://www.traxnyc.jp domain. However, all the images used on the .jp site are linked from the .com site. Would this hotlinking of images hurt me in Google at all? Also, there is quite a bit of duplicate content on the .jp site at the moment: only a few things have been translated into Japanese, and for the most part the layouts and words are exactly the same (in English). Would this hurt my Google rankings in the US at all? Thanks for all your help.
-
Does the Penguin update affect all subdomains?
A UK subdomain of a big US site got hit by Penguin last week. The two operations are completely separate apart from sharing a parent domain. The US site also runs a multitude of other subdomains in the same marketplace, and their link profile is not squeaky clean. The question is: could the actions of the US site, either through bad links or poor on-site issues, have caused Penguin to hit the UK subdomain? Unfortunately, I have no access to the US analytics or rankings data to know whether they were hit by Penguin too. Thanks
-
Is duplicate content really an issue on different international Google engines?
i.e., Google.com vs. Google.co.uk. This relates to another question I have open on a similar issue. If I run the same e-commerce site (virtually) on company.com and company.co.uk, does Google really view that as duplicate content? I would be inclined to think they have that figured out, but I haven't had much experience with international SEO...