Multiple Instances of the Same Article
-
Hi, I'm having a problem I cannot solve about duplicate article postings.
As you will see from the attached images, I have a page with multiple variants of the same URL in Google's index, as well as duplicate title tags reported in the Search Console of Webmaster Tools. For several months now I have been using canonical meta tags to resolve the issue, i.e. declaring that all variants point to a single URL, but the problem remains. It's not just old articles that stay like this; even new articles show the same behaviour right when they are published, even though they are served correctly with canonical links and a sitemap entry, as you will see from the example below.
Example URLs of the attached Image
-
All URLs belonging to the same article ID, have the same canonical link inside the html head.
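The original code sample did not survive here, but the canonical tag would look something like this (using the article URL from the sitemap snippet later in this thread):

```html
<!-- In the <head> of every desktop variant of the same article ID -->
<link rel="canonical" href="http://www.neakriti.gr/?page=newsdetail&DocID=1300357" />
```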
-
Also because I have a separate mobile site, I also include in every desktop URL an "alternate" link to the mobile site.
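Again the sample was lost, but based on the mobile URL shown in the sitemap snippet below it would be something along these lines:

```html
<!-- On the desktop page: announce the mobile equivalent to search engines -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://mobile.neakriti.gr/fullarticle.php?docid=1300357" />
```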
-
On the mobile version of the site, I have another canonical link pointing back to the original desktop URL, so the mobile article version also carries a canonical tag.
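The missing sample would presumably be the mirror of the desktop tag, something like:

```html
<!-- On the mobile page: canonical back to the original desktop URL -->
<link rel="canonical" href="http://www.neakriti.gr/?page=newsdetail&DocID=1300357" />
```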
-
Now, when it comes to the xml sitemap, I pass only the canonical URL and none of the other possible variants (to avoid multiple indexing), and I also point to the mobile version of the article.
<url>
  <loc>http://www.neakriti.gr/?page=newsdetail&DocID=1300357</loc>
  <xhtml:link rel="alternate" media="only screen and (max-width: 640px)"
              href="http://mobile.neakriti.gr/fullarticle.php?docid=1300357" />
  <lastmod>2016-02-20T21:44:05Z</lastmod>
  <priority>0.6</priority>
  <changefreq>monthly</changefreq>
  <image:image>
    <image:loc>http://www.neakriti.gr/NewsASSET/neakriti-news-image.aspx?Doc=1300297</image:loc>
    <image:title>ΟΦΗ</image:title>
  </image:image>
</url>
Source of the sitemap snippet above: http://www.neakriti.gr/WebServices/sitemap.aspx?&year=2016&month=2
The main sitemap of the website: http://www.neakriti.gr/WebServices/sitemap-index.aspx
Despite my efforts, you can see that Webmaster Tools reports three variants of the desktop URL, and Google search reports four URLs (three different desktop variants plus the mobile URL).
This is what I get when I type the article code into Google search to see what is indexed: site:neakriti.gr 1300297
So far I believe I have done all I could to resolve the issue: canonical links, alternate links, and a correct sitemap.xml entry. I don't know what else to do... All of this was done several months ago and there is absolutely no improvement.
Here is a more recent example of an article added 5 days ago (10-April-2016). Just type
site:neakriti.gr 1300357
into Google search and you will see the variants of the same article in Google's cache. Open a cached page and you will see it contains the canonical link, but Google doesn't obey the directive given there. Please help!
-
-
Hi all,
Sorry for the delay, I am away on a business trip, which is why I stopped communicating these past few days.
I can confirm that the latest entries (those after March) come as a single instance.
However, there are some minor exceptions, like the one here. Example of a recent article indexed under both the desktop URL (even though the desktop URL is not the canonical) and the mobile URL:
https://www.google.gr/search?q=site:neakriti.gr&biw=1527&bih=899&source=lnms&sa=X&ved=0ahUKEwiIxODGt5_MAhUsKpoKHdcUAkYQ_AUIBigA&dpr=1.1#q=site:neakriti.gr+1315539&tbs=qdr:w&filter=0
Also, I noticed that with the "alternate" and "canonical" links in place, the mobile version of the site doesn't get indexed anymore (with minor exceptions like the one above).
-
Hi Ioannis!
How's this going? We'd love an update.
-
Hmm, interestingly, when I followed your link, I only saw the canonical version of the article. Is this what you're seeing now?
Also, in response to your earlier question, yes, you can disallow parameters with robots.txt. If these canonical issues continue, that may be the best next step.
-
Thank you for your response, I will take a look at this.
However, I have a few questions regarding your suggestion:
- Since I have canonical links on the loaded pages, doesn't that already resolve the issue?
- The printerfriendly variation has a noindex meta tag in its head; shouldn't that be taken into account?
- Can I put regular expressions in my robots.txt? How can I block URL parameters? Both printerfriendly and newsdetailsports are values of the "page" GET parameter.
In fact, the printerfriendly page contains both a canonical link and a noindex meta tag, to tell search engines not to index the content and to let them know where the original content lives.
-
Hi there
The printer friendly URL is coming from the print this article button (attached) and the /default.aspx URL is coming from the ^ TOP button (attached).
What you could do is use your robots.txt to ignore these URLs. You can also tell Google which URL parameters to ignore, but please be EXTREMELY careful doing this. It's a fine-comb tool, not a hatchet.
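To answer the regex question: robots.txt doesn't support full regular expressions, but Google's crawler does honor the * wildcard and the $ end-anchor, which is enough to match query-string parameters. A rough sketch, using the parameter values mentioned earlier in this thread (verify the exact patterns against your own URLs before deploying):

```
# Block the printer-friendly and sports-detail variants of articles.
# "*" matches any characters, so this catches the parameter wherever
# it appears in the query string.
User-agent: *
Disallow: /*page=printerfriendly
Disallow: /*page=newsdetailsports
```

You can confirm what each rule blocks with the robots.txt tester in Search Console before it goes live.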
Let me know if you have any questions or comments, good luck!
Patrick