We 410'ed URLs to reduce the number of URLs submitted and increase crawl rate, but dynamically generated sub-URLs from pagination are showing as 404s. Should we 410 these sub-URLs?
-
Hi everyone!
We recently 410'ed some URLs to reduce the number of URLs submitted and hopefully increase our crawl rate.
We had some dynamically generated sub-URLs for pagination that are now showing as 404s in Google. These sub-URLs were canonicalized to the main URLs and not included in our sitemap.
Ex: We assumed that if we 410'ed example.com/url, then the dynamically generated example.com/url/page1 would also 410, but instead it 404’ed.
Does it make sense to go through and 410 these dynamically generated sub-URLs, or is it not worth it?
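For context, our removal rule behaves roughly like this hypothetical Flask-style sketch (not our actual stack): the 410 only matches the exact path, so the pagination sub-URL falls through to a 404.

```python
from flask import Flask, abort

app = Flask(__name__)

# Paths we deliberately retired (hypothetical; stands in for example.com/url).
GONE_PATHS = {"/url"}

@app.route("/<path:subpath>")
def catch_all(subpath):
    path = "/" + subpath
    if path in GONE_PATHS:
        abort(410)  # exact match only, so /url returns 410 Gone
    # /url/page1 doesn't match the set above, so it falls through to a 404
    abort(404)
```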
Thanks in advance for your help!
Jeff
-
Awesome - thanks for your help, Mike! I really appreciate it!
Jeff
-
You could, but it's not strictly necessary to go through all those sub-pages and 410 them. While a 410 Gone response is a stronger signal, pages serving 404s will eventually be dropped from the crawl and/or the SERPs by the bots anyway. So if those pages are just dynamically generated flak and don't provide anything of value, leave them as 404s and don't worry about it.
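Either way, it's worth spot-checking a few of those sub-URLs to confirm they really return a hard 404 (and not a soft 200), since that's what lets the bots drop them on their own. A rough sketch, assuming Python with the requests library and placeholder URLs:

```python
import requests

# Placeholder URLs; swap in a sample of your dynamically generated sub-URLs.
SUB_URLS = [
    "https://example.com/url/page1",
    "https://example.com/url/page2",
]

for url in SUB_URLS:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)  # expect 404 or 410, not 200
```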