Do we have any risk or penalty for double canonicals?
-
Hi all,
We have double canonicals: page A points to page B, and page B points to page C. Will this be okay for Google, or do we definitely need to make it A to C and B to C?
Thanks
-
Yes! I read the example backward. I'm with you! All pages should point to C.
-
Hi vtmoz.
I think Steve made a typo, saying to point all back to A.
My opinion here is:
- Avoid these canonical chains at all costs. They send confusing signals to Google, and they may lead Googlebot to reduce the importance of your pages or to ignore the canonicals entirely.
- There is no risk of any known penalty. Google will probably not report anything in Search Console about chained canonicals.
- Point page A to C and page B to C.
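To illustrate the fix, here is a minimal sketch (with hypothetical URLs) of flattening such a chain so that every page points straight at the final canonical target:

```python
# Hypothetical canonical chain: page A -> page B -> page C.
canonicals = {
    "https://example.com/page-a": "https://example.com/page-b",
    "https://example.com/page-b": "https://example.com/page-c",
}

def resolve(url, mapping):
    """Follow rel=canonical hops until reaching a page with no further target."""
    seen = set()
    while url in mapping and url not in seen:
        seen.add(url)  # guard against canonical loops
        url = mapping[url]
    return url

# Rewrite every page's canonical to the end of its chain:
flattened = {page: resolve(page, canonicals) for page in canonicals}
# Both A and B now point directly at C, with no intermediate hop.
```

After flattening, each duplicate declares C directly, which is what Google recommends.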
Hope it helps.
Best Luck.
GR. -
I'd have them all pointing to C so it's a little easier to manage long term. Google just updated some of their docs on canonical URL use cases with some great examples. From this page:
You can use a rel="canonical" link tag in the page header to indicate when a page is a duplicate of another page.

Suppose you want
https://example.com/dresses/green-dresses
to be the canonical URL, even though a variety of URLs can access this content. Indicate this URL as canonical with these steps:

Mark all duplicate pages with a rel="canonical" link element. Add a <link> element with the attribute rel="canonical" to the <head> section of duplicate pages, pointing to the canonical page, like this one:

<link rel="canonical" href="https://example.com/dresses/green-dresses" />

If the canonical page has a mobile variant, add a rel="alternate" link to it, pointing to the mobile version of the page:

<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/dresses/green-dresses">

Add any hreflang or other redirects appropriate for the page.
They don't touch on the chain of canonical URLs you describe, but I'd have them all pointing to C, since that is a scalable change.
[edit: updated to match example in OP]
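As a sketch of how you might audit which target each page actually declares, the standard-library HTML parser can pull the canonical URL out of a page's <head> (the markup below is a hypothetical example matching the docs quoted above):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of a <link rel="canonical"> element, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical page head declaring its canonical target:
html = '''<head>
<link rel="canonical" href="https://example.com/dresses/green-dresses">
</head>'''

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # the canonical target declared on the page
```

Running this across pages A, B, and C would reveal any chain (a page whose declared canonical is itself the duplicate of another page).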
-
Related Questions
-
Page content is not very similar but the topic is the same: Will Google consider the rel canonical tags?
Hi Moz community, We have multiple pages from our own different sub-domains on the same topics. These pages even rank in the SERPs for related keywords. Now we are planning to show only one of the pages in the SERPs. Unfortunately we cannot redirect, so we are planning to use rel canonical tags. But the page content is not the same: only 20% is similar and 80% is different, although the context is the same. If we use rel canonicals, will Google accept this? If not, what should I do? Would making the header tags similar work? How does Google respond if the content does not match: does it just ignore the tag, or is there a negative score? Thanks
Algorithm Updates | vtmoz
-
Any risks involved in redirecting low quality Infringement website?
Hi all, Recently we have taken over one of the websites (with a trademark infringement) that had been using our domain name in its domain. That website got no traffic or backlinks. Is there any risk involved in redirecting that website to our website? Thanks
Algorithm Updates | vtmoz
-
Canonical URLs being ignored?
Hi Guys, Has anybody noticed canonical URLs being ignored where they were previously obeyed? I have a site that is doing this at the moment and just wondered if this was being seen elsewhere and if anyone knows what the solution is? Thanks, Elias
Algorithm Updates | A_Q
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow submission to Google have been done, but after months and months of waiting nothing has happened. If anything, after the recent Penguin update, results have been further affected.

A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, but with little success. I have read up on this and not many people appear to agree on whether it will work.

Therefore, my new decision is to start afresh on a new domain, switching from the .com to the .co.uk version, to remove all legacy and all association with the spam-ridden .com. My main concern is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first suspect.

This could then cause a duplicate content issue, since Google would know that this content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as a noindex/nofollow, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Once it has been deindexed, the new .co.uk site will go live with the exact same content.

So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience with the removal of a 301 redirect, detaching legacy, and its success would be very helpful!
Thank you, Denver
Algorithm Updates | ProdoDigital
-
Canonical URL
Hello, All the pages of my site contain a canonical URL (it shows in the source), but the SEOmoz site reports an error that some of the pages do not contain canonical URLs. Can anyone help me?
Algorithm Updates | KLLC
-
Should I use canonical tags on my site?
I'm trying to keep this a generic example, so apologies if this is too vague. On my main website, we've always had a duplicate content issue. The main focus of our site is breaking down to specific, brick-and-mortar locations. We have to duplicate the description of a product/service for every geographic location (this is a legal requirement). So, for example, you might have the parent "product/service" page targeting the term, and then hundreds of sub-pages with "product/service San Francisco", "product/service Austin", etc. These pages have identical content except that the geographic location is dynamically swapped out. There is also additional useful content like a Google map of the area, local resources, etc.

As I said, this was always seen as an SEO issue; specifically, you could see in the way Googlebot would crawl pages and how PageRank flowed through the site that having hundreds of pages with identical copy, just swapping out the geographic location, wasn't seen as good content. However, we still always received traffic and conversions for the long-tail geographic terms, so we left it.

Last year, with Panda, we noticed a drop in traffic and thought it was due to this duplicate issue, so I added canonical tags to all our geographic-specific product/service pages, pointing back to the parent page. That seemed to be received well by Google, and traffic was back to normal in short order.

However, what I notice a LOT recently in our SERPs is that if I type in a geographic-specific term, i.e. "product/service san francisco", our deep page with the canonical tag is what Google is ranking. Google inserts its own title tag on the SERP and leaves the description blank, as it doesn't index the page due to the canonical tag. Essentially, what I think it is rewarding is the site architecture, which organizes the content by the specific geo in the URL: site.com/service/location/san-francisco. Other than that there is no reason for it to rank that page.
Sorry if this is lengthy, thanks for reading all of that! Essentially my question is: should I keep the canonical tags on the site or take them off, since Google insists on ranking the page? If I am ranking already, then the potential upside of removing them is ranking higher (we're usually in the 3-6 spot on the results page) and also a higher CTR, because we can get a description back on our result. The counter-argument is that I'm already ranking, so leave it and focus on other things. Appreciate your thoughts on this!
Algorithm Updates | edu-SEO
-
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention that I did post this on one other forum, so I hope that is not completely against the rules here; I'm just trying to get an idea from some of the pros at both sources. Now for the question...

"Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations, I would love to hear them.

The site is very large and utilizes faceted navigation to help visitors sift through results. I have implemented rel=canonical for many months now, so that each page URL created by the faceted nav filters points back to the main category page. However, I still get these messages from Google every month or so saying that they found too many pages on the site. My main concern is wasting crawler time on all these pages when I am trying to do what they ask and tell them to ignore these URLs and find the content on page X instead.

So at this point I am thinking about using the robots.txt file to handle these, but I wanted to see what others here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
Algorithm Updates | PeteGregory
-
Forum software penalties
I'm hoping to solicit some feedback on what people feel would be SEO best practices for message board/forum software. Specifically, while healthy message boards can generate tons of unique content, they also generate their fair share of thin content pages. These pages include:
- Calendar pages, which can have a page for each day of each month for 10 years (that's about 3,650 pages of just links).
- User profile pages, which depending on your setup can tend to be thin. The board I work with has 20k registered members, hence 20k user profile pages.
- User lists, which can run to several hundred pages.

I believe Google is pretty good at understanding what is message board content, but there is still a good chance that one could be penalized for these harmless pages. Do people feel that the above pages should be noindexed?

Another issue is that of unrelated content. Many forums have their off-topic areas (the Pub, or Hangout, or whatever). On our forum up to 40% of the content is off-topic (when I say content I mean number of posts rather than raw word count). What are the advantages and disadvantages of such content? On one hand, it expands the keywords you can rank for. On the other hand, it might generate organic Google traffic which you might not want because of a high bounce rate. Does too much indexable content that is unique dilute your good content?
Algorithm Updates | entropytc