Multiple Instances of the Same Article
-
Hi, I'm having a problem I cannot solve with duplicate article postings.
As you will see from the attached images, I have a page with multiple variants of the same URL in Google's index, as well as duplicate title tags reported in the Webmaster Tools search console. For several months I have been using canonical meta tags to resolve the issue, i.e. declaring that all variants point to a single URL, yet the problem remains. It's not just old articles that stay like this; even new articles show the same behaviour as soon as they are published, even though they are served correctly with canonical links and a sitemap entry, as you will see from the example below.
Example URLs from the attached image
-
All URLs belonging to the same article ID, have the same canonical link inside the html head.
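(Reconstructing the snippet shown in the attachment, based on the canonical URL used in my sitemap; the DocID here is just an example:)

```html
<!-- In the <head> of every desktop variant of the article -->
<link rel="canonical" href="http://www.neakriti.gr/?page=newsdetail&DocID=1300357" />
```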
-
Also because I have a separate mobile site, I also include in every desktop URL an "alternate" link to the mobile site.
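(Reconstructing that snippet too, based on the media query and mobile URL from my sitemap; the docid is illustrative:)

```html
<!-- In the <head> of the desktop article page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://mobile.neakriti.gr/fullarticle.php?docid=1300357" />
```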
-
At the Mobile Version of the Site, I have another canonical link, pointing back to the original Desktop URL. So the mobile site article version also has
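(Again reconstructed from the URL pattern in my sitemap; the DocID is illustrative:)

```html
<!-- In the <head> of the mobile article page -->
<link rel="canonical" href="http://www.neakriti.gr/?page=newsdetail&DocID=1300357" />
```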
-
Now, when it comes to the XML sitemap, I pass only the canonical URL and none of the other possible variants (to avoid multiple indexing), and I also point to the mobile version of the article.
<url>
  <loc>http://www.neakriti.gr/?page=newsdetail&amp;DocID=1300357</loc>
  <xhtml:link rel="alternate" media="only screen and (max-width: 640px)" href="http://mobile.neakriti.gr/fullarticle.php?docid=1300357" />
  <lastmod>2016-02-20T21:44:05Z</lastmod>
  <priority>0.6</priority>
  <changefreq>monthly</changefreq>
  <image:image>
    <image:loc>http://www.neakriti.gr/NewsASSET/neakriti-news-image.aspx?Doc=1300297</image:loc>
    <image:title>ΟΦΗ</image:title>
  </image:image>
</url>
Source of the sitemap snippet above: http://www.neakriti.gr/WebServices/sitemap.aspx?&year=2016&month=2
The main sitemap index of the website: http://www.neakriti.gr/WebServices/sitemap-index.aspx
Despite my efforts, you can see that Webmaster Tools reports three variants for the desktop URL, and Google search returns four URLs (three different desktop variants plus the mobile URL).
This is what I get when I search for the article code to see what is indexed in Google search: site:neakriti.gr 1300297
So far I believe I have done all I could in order to resolve the issue by addressing canonical links and alternate links, as well as correct sitemap.xml entry. I don't know what else to do... This was done several months ago and there is absolutelly no improvement.
Here is a more recent example of an article added five days ago (10 April 2016). Just type
site:neakriti.gr 1300357
into Google search and you will see the variants of the same article in Google's cache. Open a cached page and you will see that it contains the canonical link, but Google doesn't obey the directive given there. Please help!
-
-
Hi all,
sorry for the delay; I am away on a business trip, which is why I went quiet for the past few days.
I can confirm that the latest entries (those after March) come as a single instance.
However, there are some minor exceptions, like the one below: a recent article indexed under both a desktop URL (even though that desktop URL is not the canonical) and the mobile URL.
https://www.google.gr/search?q=site:neakriti.gr&biw=1527&bih=899&source=lnms&sa=X&ved=0ahUKEwiIxODGt5_MAhUsKpoKHdcUAkYQ_AUIBigA&dpr=1.1#q=site:neakriti.gr+1315539&tbs=qdr:w&filter=0
Also, I noticed that with the "alternate" and "canonical" links in place, the mobile version of the site no longer gets indexed (with minor exceptions like the one above).
-
Hi Ioannis!
How's this going? We'd love an update.
-
Hmm, interestingly, when I followed your link, I only saw the canonical version of the article. Is this what you're seeing now?
Also, in response to your earlier question, yes, you can disallow parameters with robots.txt. If these canonical issues continue, that may be the best next step.
-
Thank you for your response; I will take a look at this.
However, I have a few questions regarding your suggestion:
- Since I have canonical links on the loading page, doesn't that resolve the issue?
- The printer-friendly variation has a noindex meta tag in the head; shouldn't that be taken into account?
- Can I put regular expressions in my robots.txt? How can I block URL parameters? Both printerfriendly and newsdetailsports are values of the "page" GET parameter.
In fact, the printer-friendly page contains both a canonical link and a noindex meta tag, to tell search engines not to index the content and to let them know where the original content lives.
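(Reconstructed, not the exact markup, so the head of the printer-friendly page contains something along these lines:)

```html
<!-- In the <head> of the printer-friendly variant -->
<meta name="robots" content="noindex" />
<link rel="canonical" href="http://www.neakriti.gr/?page=newsdetail&DocID=1300357" />
```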
-
Hi there
The printer-friendly URL is coming from the "print this article" button (attached), and the /default.aspx URL is coming from the ^ TOP button (attached).
What you could do is use your robots.txt to exclude these URLs. You can also tell Google which URL parameters to ignore, but please be EXTREMELY careful doing this: it's not a fine-toothed comb, it's a hatchet.
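As a sketch of the robots.txt approach, assuming the two "page" parameter values mentioned earlier and relying on Googlebot's support for the * wildcard (plain robots.txt has no regular expressions), it could look like this; test it in Search Console's robots.txt tester before deploying:

```
User-agent: *
Disallow: /*page=printerfriendly
Disallow: /*page=newsdetailsports
```

One caveat: blocking these URLs stops Google from crawling them at all, which also means it will no longer see the noindex and canonical tags on those pages, and already-indexed URLs may linger for a while before dropping out.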
Let me know if you have any questions or comments, good luck!
Patrick