Is there a limit to the number of duplicate pages pointing to a rel='canonical' primary?
-
We have a situation on twiends where a number of our 'dead' user pages have generated links for us over the years. Our options are to 404 them, 301 them to the home page, or just serve back the home page with a canonical tag.
We've been 404'ing them for years, but I understand that we lose all the link juice from doing this. Correct me if I'm wrong?
Our next plan would be to 301 them to the home page. That's probably the best solution, but our concern is that if a user page is only temporarily down (under review, etc.), it could be permanently removed from the index, or at least cached for a very long time.
A final plan is to just serve back the home page on the old URL, with a canonical tag pointing to the home page URL. This is quick, retains most of the link juice, and allows the URL to become active again in the future. The problem is that there could be hundreds of thousands of these.
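To make the options concrete, here's a minimal sketch of how a dead user URL might be handled under each approach (Flask; the helpers and template names are hypothetical stand-ins for our real lookups):

```python
# Minimal sketch only. user_exists / is_under_review stand in for real
# database lookups; the route and template names are made up.
from flask import Flask, redirect, render_template

app = Flask(__name__)

def user_exists(username):
    return username in {"alice", "bob"}   # placeholder DB check

def is_under_review(username):
    return username == "carol"            # placeholder status check

@app.route("/<username>")
def user_page(username):
    if user_exists(username):
        return render_template("user.html", user=username)
    if is_under_review(username):
        # Option 3: serve the home page content on the old URL with a
        # canonical tag pointing at "/", so the URL can come back later.
        return render_template("home.html", canonical="https://twiends.com/")
    # Option 2: 301 the permanently dead page to the home page.
    return redirect("/", code=301)
    # Option 1 (what we do today) would simply be: flask.abort(404)
```

Here home.html would carry `<link rel="canonical" href="{{ canonical }}">` in its head.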
Q1) Is it a problem to have 100,000 URLs pointing to a primary with a rel=canonical tag? (Problem for Google?)
Q2) How long does it take a canonical duplicate page to become unique in the index again if the tag is removed? Will Google recrawl it and add it back into the index? Do we need to use WMT to speed this process up?
Thanks
-
I'll add this article by Rand that I came across too. I'm busy testing the solution presented in it:
https://moz.com/blog/are-404-pages-always-bad-for-seo
In summary: 404 all dead pages with a good custom 404 page so as not to waste crawl bandwidth, then selectively 301 those dead pages that have accrued good link value.
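A rough sketch of that hybrid approach (the paths and set are illustrative; the real list would come from a backlink export):

```python
# Sketch of the hybrid approach: 301 only the dead pages that earned
# links, hard-404 everything else. HIGH_VALUE_DEAD_PATHS is made up;
# in practice it would be built from a backlink export.
from flask import Flask, abort, redirect

app = Flask(__name__)

HIGH_VALUE_DEAD_PATHS = {
    "/some-old-user",      # kept because it has strong external links
    "/another-old-user",
}

@app.route("/<path:path>")
def dead_page(path):
    if "/" + path in HIGH_VALUE_DEAD_PATHS:
        return redirect("/", code=301)   # reclaim the link value
    abort(404)                           # everything else: hard 404
```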
Thanks Donna/Tammy for pointing me in this direction.
-
In this scenario, yes: a customized 404 page with links to a few useful top-level pages would serve both the user and Google better. From a strictly SEO standpoint, 100,000 redirects and/or canonical tags would not benefit your SEO.
-
Thanks Donna, good points.
We return a hard 404, so it's treated correctly by Google. We are just looking at this from an SEO point of view now to see if there's any way to reclaim the lost link juice.
Your point about looking at the value of those incoming links is a good one. I suppose it's not worth making Google crawl 100,000 more pages for the sake of a few links. We've just started seeing these pop up in Moz Analytics as link opportunities, and we can see them as 404s in Site Explorer too. There are a few hundred of these incoming links pointing to a 404, so we feel this could have an impact.
I suppose we could selectively 301 any higher-value links to the home page. It would be an administrative nightmare, but doable.
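Something like this could take the pain out of building that list. It's a sketch assuming a CSV backlink export; the file name and column names are guesses, so adjust them to whatever your tool actually exports:

```python
# Sketch: build a selective-301 list from a backlink CSV export.
# "inbound_links.csv" and the column names are assumptions, not a real
# Moz export format; adapt them to your tool's output.
import csv
from collections import defaultdict

links_per_target = defaultdict(list)
with open("inbound_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        links_per_target[row["Target URL"]].append(float(row["Domain Authority"]))

# Keep only dead URLs with at least one reasonably strong inbound link.
for url, scores in sorted(links_per_target.items()):
    if max(scores) >= 30:     # arbitrary cut-off for "higher value"
        print(url)            # candidate for a 301 to the home page
```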
How do others tackle this problem? Does everyone just hard 404 a page, even though that loses the link juice from its incoming links?
Thanks
-
Hi David,
When you say "we've been 404'ing them for years", does that mean you've created a custom 404 page that explains the situation to site visitors, or does it mean you've been letting them naturally error and return the appropriate 404 (page not found) code to Google? It makes a difference. If the pages truly no longer exist and there is no equivalent replacement, you should be letting them naturally error (return a 404 status code) so as not to mislead Google's robots and site visitors.
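For what it's worth, the "hard 404 with a helpful custom page" pattern is straightforward; a sketch in Flask (the template name is made up):

```python
# Sketch: a custom 404 page that still returns a real 404 status code.
from flask import Flask, render_template

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(e):
    # The explicit 404 in the tuple matters: returning the template alone
    # would send a 200 and turn this into a soft 404.
    return render_template("404.html"), 404
```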
Have you looked at the value of those incoming links? They may be low value anyway. There may be more valuable things you could be doing with your time and budget.
To answer your specific questions:
Q1) Is it a problem to have 100,000 URLs pointing to a primary with a rel=canonical tag? (Problem for Google?)
Yes, if those pages (or valuable replacements) don't actually exist. You'd be wasting valuable crawl budget. This looks like it might be especially true in your case given the size of your site. Check out this article. I think you might find it very helpful. It's an explanation of soft 404 errors and what you should do about them.
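One quick way to check for soft 404s is to request an obviously nonexistent URL and look at the status code the server actually returns (sketch; the test URL is made up):

```python
# Sketch: detect a soft 404 by requesting a URL that cannot exist.
import requests  # third-party: pip install requests

resp = requests.get(
    "https://twiends.com/this-page-should-not-exist-xyz",
    allow_redirects=False,
    timeout=10,
)
print(resp.status_code)  # expect 404; a 200 here means a soft 404
```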
Q2) How long does it take a canonical duplicate page to become unique in the index again if the tag is removed? Will google recrawl it and add it back into the index? Do we need to use WMT to speed this process up?
If the canonical tag is changed or removed, Google will find and reindex it next time it crawls your site (assuming you don't run out of crawl budget). You don't need to use WMT unless you're impatient and want to try to speed the process up.
-
Thanks Sandi, I did. It's a great article and it answered many questions for me, but I couldn't really get clarity on my last two questions above.
-
Hey David
Check out this Moz blog post about rel=canonical, appropriately named "Rel=Confused?"