Multiple Instances of the Same Article
-
Hi, I'm having a problem I cannot solve with duplicate article postings.
As you will see from the attached images, I have a page with multiple variants of the same URL in Google's index, as well as duplicate title tags reported in the Search Console of Webmaster Tools. For several months I have been using canonical meta tags to resolve the issue, i.e. declaring that all variants point to a single URL, but the problem remains. It's not just old articles that stay like this; even new articles show the same behaviour right when they are published, even though they are presented correctly with canonical links and a sitemap entry, as you will see from the example below.
Example URLs from the attached image
-
All URLs belonging to the same article ID have the same canonical link inside the HTML head.
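(For illustration, the canonical tag on each desktop variant would presumably look something like the following, using the article ID that appears in the sitemap snippet further down; the exact markup is a sketch, not a copy of the live page:)

```html
<link rel="canonical" href="http://www.neakriti.gr/?page=newsdetail&amp;DocID=1300357" />
```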
-
Also, because I have a separate mobile site, I include in every desktop page an "alternate" link pointing to the mobile version.
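(A sketch of that alternate link, based on the href and media attributes shown in the sitemap snippet further down:)

```html
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://mobile.neakriti.gr/fullarticle.php?docid=1300357" />
```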
-
On the mobile version of the site, I have another canonical link pointing back to the original desktop URL. So the mobile article version also has:
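(Presumably something like this, again using the desktop URL from the sitemap snippet below as the canonical target:)

```html
<link rel="canonical" href="http://www.neakriti.gr/?page=newsdetail&amp;DocID=1300357" />
```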
-
Now, when it comes to the XML sitemap, I submit only the canonical URL and none of the other possible variants (to avoid multiple indexing), and I also point to the mobile version of the article:
<url>
  <loc>http://www.neakriti.gr/?page=newsdetail&amp;DocID=1300357</loc>
  <xhtml:link rel="alternate" media="only screen and (max-width: 640px)" href="http://mobile.neakriti.gr/fullarticle.php?docid=1300357"/>
  <lastmod>2016-02-20T21:44:05Z</lastmod>
  <priority>0.6</priority>
  <changefreq>monthly</changefreq>
  <image:image>
    <image:loc>http://www.neakriti.gr/NewsASSET/neakriti-news-image.aspx?Doc=1300297</image:loc>
    <image:title>ΟΦΗ</image:title>
  </image:image>
</url>
The above sitemap snippet comes from: http://www.neakriti.gr/WebServices/sitemap.aspx?&year=2016&month=2
The main sitemap of the website: http://www.neakriti.gr/WebServices/sitemap-index.aspx
Despite my efforts, you can see that Webmaster Tools reports three variants for the desktop URL, and Google search reports four URLs (three different desktop variant URLs and the mobile URL).
This is what I get when I type the article code to see what is indexed in Google search: site:neakriti.gr 1300297
So far I believe I have done all I could to resolve the issue by addressing canonical links and alternate links, as well as correct sitemap.xml entries. I don't know what else to do... This was done several months ago and there is absolutely no improvement.
Here is a more recent example of an article added 5 days ago (10-April-2016); just type
site:neakriti.gr 1300357
into Google search and you will see the variants of the same article in Google's cache. Open a Google cached page and you will see the cached pages contain the canonical link, but Google doesn't obey the direction given there. Please help!
-
-
Hi all,
Sorry for the delay; I am away on a business trip, which is why I stopped communicating the past few days.
I can confirm that the latest entries (those after March) come up as a single instance.
However, there are some minor exceptions, like the one here. Example of a recent article indexed under both the desktop URL (even though the desktop URL is not the canonical) and the mobile URL:
https://www.google.gr/search?q=site:neakriti.gr&biw=1527&bih=899&source=lnms&sa=X&ved=0ahUKEwiIxODGt5_MAhUsKpoKHdcUAkYQ_AUIBigA&dpr=1.1#q=site:neakriti.gr+1315539&tbs=qdr:w&filter=0
Also, I noticed that with the "alternate" and "canonical" links in place, the mobile version of the site doesn't get indexed anymore (with minor exceptions like the one above).
-
Hi Ioannis!
How's this going? We'd love an update.
-
Hmm, interestingly, when I followed your link, I only saw the canonical version of the article. Is this what you're seeing now?
Also, in response to your earlier question, yes, you can disallow parameters with robots.txt. If these canonical issues continue, that may be the best next step.
-
Thank you for your response, I will take a look at this.
However, I have a few questions regarding your suggestion:
- Since I have canonical links on the loading page, doesn't that resolve the issue?
- The printer-friendly variation has a noindex meta tag in the head; shouldn't that be taken into account?
- Can I put regular expressions in my robots.txt? How can I block URL parameters? Both printerfriendly and newsdetailsports are values of the "page" GET parameter.
In fact, the printer-friendly page contains a canonical link and a noindex meta tag to tell search engines not to index the content and to let them know where the original content exists.
-
Hi there
The printer friendly URL is coming from the print this article button (attached) and the /default.aspx URL is coming from the ^ TOP button (attached).
What you could do is use your robots.txt to ignore these URLs. You can also tell Google what URL parameters to ignore, but please be EXTREMELY careful doing this. It's a fine-comb tool, not a hatchet.
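To answer the regex question: robots.txt does not support full regular expressions, but Google honors the `*` wildcard and the `$` end anchor, which is enough to match query-string parameters. A minimal sketch for the two variants mentioned above (treat the exact parameter value as an assumption and verify it against your real URLs before deploying):

```
User-agent: *
# Block the printer-friendly variant (any URL whose "page" parameter is printerfriendly)
Disallow: /*page=printerfriendly
# Block the /default.aspx variant produced by the "TOP" button
Disallow: /default.aspx
```

One caveat: a URL blocked in robots.txt can no longer be crawled, so Google will never see the canonical or noindex tags on those pages. Test any pattern in Search Console's robots.txt tester before going live.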
Let me know if you have any questions or comments, good luck!
Patrick