How to handle blank, auto-generated system pages/URLs
-
Hi Guys
Our backend system has been creating listing pages from out-of-date, irrelevant data, so we now have hundreds of thousands of pages that are blank but currently indexable and active. They're almost impossible to reach from the front end and receive zero traffic, but anyone with the URL can access them, and I'm fairly sure that, given the site architecture, Google is crawling them regardless. My first thought was to 301 these pages to the most closely related page on the site, but I'm concerned we'd be wasting crawl budget; we don't want these pages crawled or found at all. Would a sound solution be to make them inactive, noindex them, and serve a custom 404 in the event anyone (or the crawler) reaches them? Would this enormous increase in 404 pages cause us issues?
Many thanks
-
Thanks for such a speedy reply! It's a daunting task, as there are literally thousands and thousands of pages, so we want to be sure we're doing the right thing. I appreciate your help. I'll now investigate blocking within robots.txt and using Google Search Console to remove the URLs.
-
First, don't 404 them; return a 410 (Gone) status code instead, as that denotes intentional, permanent removal. In addition, I would block the files/folders in robots.txt. Finally, I would use Google Search Console to remove these URLs. Good luck.
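As a sketch of the answer above, the routing decision can live in one check: stale auto-generated listing URLs get a 410, everything else is served normally. The URL prefix and ID set here are hypothetical, not taken from the original site.

```python
# Illustrative only: the prefix below is a made-up example of where the
# auto-generated listing pages might live on the original site.
RETIRED_PREFIXES = ("/listings/auto-generated/",)

def status_for(path: str, retired_ids: set) -> int:
    """Return the HTTP status a request for `path` should get."""
    for prefix in RETIRED_PREFIXES:
        if path.startswith(prefix):
            page_id = path[len(prefix):]
            if page_id in retired_ids:
                return 410  # Gone: intentionally and permanently removed
    return 200  # everything else serves normally

print(status_for("/listings/auto-generated/123", {"123"}))
```

The 410 tells crawlers the removal is deliberate; the robots.txt block and Search Console removal from the answer then handle discovery and the existing index entries.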
Related Questions
-
Should I disallow all URL query strings/parameters in robots.txt?
Webmaster Tools correctly identifies the query strings/parameters used in my URLs, but still reports duplicate title tags and meta descriptions for the original URL and the versions with parameters. For example, Webmaster Tools would report duplicates for the following URLs, despite correctly identifying the "cat_id" and "kw" parameters:

/Mulligan-Practitioner-CD-ROM
/Mulligan-Practitioner-CD-ROM?cat_id=87
/Mulligan-Practitioner-CD-ROM?kw=CROM

Additionally, these pages have self-referential canonical tags, so I would think I'd be covered, but I recently read that another Mozzer saw a great improvement after disallowing all query/parameter URLs, despite Webmaster Tools not reporting any errors. As I see it, I have two options:

1. Manually tell Google that these parameters have no effect on page content via the URL Parameters section in Webmaster Tools (in case Google is unable to detect this automatically and I am being penalized as a result).
2. Add "Disallow: *?" to hide all query/parameter URLs from Google. My concern here is that most backlinks include the parameters, and in some cases these parameter URLs outrank the original.

Any thoughts?
Intermediate & Advanced SEO | jmorehouse
-
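A rule like `Disallow: *?` relies on the `*` wildcard and `$` end-anchor that Google honours but that Python's built-in urllib.robotparser does not expand. A minimal matcher, using the paths from the question, shows what the rule would and wouldn't block:

```python
import re

def robots_rule_matches(rule: str, path: str) -> bool:
    """Check a path against a robots.txt Disallow value, supporting the
    '*' wildcard and the '$' end anchor as Google interprets them."""
    pattern = re.escape(rule).replace(r"\*", ".*")  # '*' matches any run
    if pattern.endswith(r"\$"):                     # trailing '$' anchors the end
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

# 'Disallow: *?' blocks any path containing a query string...
print(robots_rule_matches("*?", "/Mulligan-Practitioner-CD-ROM?cat_id=87"))
# ...but not the clean original URL.
print(robots_rule_matches("*?", "/Mulligan-Practitioner-CD-ROM"))
```

One caveat worth weighing before choosing option 2: a disallowed URL can't pass signals from the backlinks the question mentions, since Google stops fetching it entirely.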
Same URLs, different CMS and server setup: Page Authority now 1
We have moved a client's website over to a new CMS and onto a new server. The domain and URLs on the main pages of the website are exactly the same, so we did not do any 301 redirects. The overall Domain Authority of the site and the Page Authority of the homepage, while having dropped a bit, seem OK. However, all the other pages now have a Page Authority of 1. I'm not exactly sure what the IT guys have done, but there was some rerouting applied at the server level. The move happened around the end of December 2014, and yes, traffic has dropped significantly. Any ideas?
Intermediate & Advanced SEO | daracreative
-
Mobile Search Results Include Pages Meant Only for Desktops/Laptops
When I put in site:www.qjamba.com on a mobile device, it comes back with some of my mobile-friendly pages for that site (same URL for mobile and desktop, just different formatting), and that's great. HOWEVER, it also shows a whole bunch of pages (not identified by Google as mobile-friendly) that are fine for desktop users but are not supposed to exist for mobile users, because they are too slow. Until a few days ago those pages were being redirected to the home page for mobile users; I have since changed that to 404 Not Found responses. Do we know that Google keeps a mobile index separate from the desktop index? If so, I would think the 404 should work. How can I test whether the 404s will remove a URL so it DOESN'T appear on a mobile device when I put in site:www.qjamba.com (or a user searches) but DOES appear on a desktop for the same command?
Intermediate & Advanced SEO | friendoffood
-
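Before relying on site: queries to verify the behaviour, the serving logic described above can be sanity-checked as a pure function. This is only a sketch of the approach in the question; the user-agent markers and the desktop-only path set are illustrative, not a complete mobile-detection scheme.

```python
# Illustrative markers; real mobile detection usually uses a maintained library.
MOBILE_MARKERS = ("Mobile", "Android", "iPhone", "iPad")

def respond(path: str, user_agent: str, desktop_only: set) -> int:
    """Return 404 for desktop-only pages when the client looks mobile,
    200 otherwise - the behaviour the question says was recently deployed."""
    is_mobile = any(marker in user_agent for marker in MOBILE_MARKERS)
    if is_mobile and path in desktop_only:
        return 404
    return 200

print(respond("/slow-page", "Mozilla/5.0 (iPhone; CPU iPhone OS 9_0)", {"/slow-page"}))
```

Testing each branch this way (mobile UA vs desktop UA, listed path vs unlisted path) confirms the server does what the question intends before waiting on Google to re-crawl.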
Change URL structure while keeping social media likes/shares
Hi guys, We're thinking of changing the URL structure of the tutorials (we call it the knowledgebase) section on our website. We want a shorter URL that sits closer to the TLD. For convenience, we'll call them the old page (www.domain.com/profiles/profile_id/kb/article_title) and the new page (www.domain.com/kb/article_title). What I'm looking to do is change the URL structure but keep the likes/shares we got from Facebook. I thought of two ways to do it and would love to hear which the community thinks is better.

1. Use rel=canonical. We might point a rel=canonical at the new page and add a "noindex" tag to the old page. That way, users will still be able to reach the old page, but the link equity will pass to the new page; the old pages will disappear from the Google SERPs and the new pages will start to appear. I understand it will be a pretty long process, but that's the only way the likes will stay.
2. Play with the og:url property. Do the 301 redirect to the new page, but change the og:url property inside that page to the old page's URL. It's a bit trickier but might work.

What do you think? Which way is better, or maybe there is a better way I'm not familiar with yet? Thanks so much for your help! Shaqd
Intermediate & Advanced SEO | ShaqD
-
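Under option 1 above, each old page stays live but carries head tags steering search engines to the new URL. A minimal sketch of generating those tags, using the placeholder URLs from the question (whether combining noindex with a canonical sends Google mixed signals is itself debatable, so treat this as an illustration of the proposal, not a recommendation):

```python
def old_page_head(new_url: str) -> str:
    """Head tags for the old URL under option 1: the canonical points at
    the new URL, and noindex drops the old page from the SERPs while it
    keeps serving Facebook's crawler the address that earned the likes."""
    return (
        f'<link rel="canonical" href="{new_url}">\n'
        '<meta name="robots" content="noindex">\n'
    )

print(old_page_head("http://www.domain.com/kb/article_title"))
```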
301 redirecting old URLs to new URLs: when should you update the sitemap?
Hello Mozzers, If you are amending your URLs (301ing to new URLs), when in the process should you update your sitemap to reflect the new URLs? I have heard some suggest you should submit a new sitemap alongside the old sitemap to support indexing of the new URLs, but I've no idea whether that advice is valid or not. Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
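Whatever the timing, the end state is usually that the sitemap lists only destination URLs, since sitemap entries act as hints about which URLs you consider canonical. One way to keep that invariant is to regenerate the sitemap directly from the redirect map; a sketch, assuming a simple old-to-new mapping:

```python
from xml.etree import ElementTree as ET

def build_sitemap(redirect_map: dict) -> str:
    """Build a sitemaps.org urlset containing only the 301 destinations,
    so the sitemap can never drift out of sync with the redirects."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for new_url in sorted(set(redirect_map.values())):
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = new_url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap({"http://example.com/old-page": "http://example.com/new-page"}))
```

The example.com URLs are placeholders; the point is the invariant, not the specific map.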
Automotive part / OEM / manufacturer numbers
Hi All, What's the best way to optimise pages for OE / manufacturer part numbers? Disclaimer: all part numbers in this post are fictional; I don't want this post outranking my client for real part numbers 🙂 Take this throttle body for example: WOODYS S-AB-Q.123.53G. This is the main part number from WOODYS (the manufacturer). However, these are all variations of exactly the same product:

Woodys 2.78972.11.0
Woodys 2.78972.16.0
Woodys 2.78972.20.0
Woodys 2.78972.26.0

And car brands use OE numbers for these parts, such as:

VWA 9808e40923G
VWA 9808e40923L
VWA 9808e40923M
VWA 9808e40923P
VWA 9808e40923Q

These internal part numbers are vitally important, as most of my client's customers are garages/mechanics, so they're very likely to search on OE numbers. So, would you suggest:

1. Optimising 10 different pages for the same product (using the part numbers in the URL, title and H1)? The problem is there's no unique content for these pages; only the part number varies, so this would likely get penalised for duplicate content, or not enough unique content.
2. Optimising one page for all terms? If so, how do you suggest doing this to ensure all part/OE numbers rank well and are prominent in the SERPs? Could Schema.org help here by marking up these OE numbers with the isSimilarTo property of the Product type? I'm trying to ensure these part numbers get equal presence in the SERP snippet when searched for, even though I can't physically include all these numbers in the title tag, URL and H1 of one page.
3. Something else?

Thanks, Woody 🙂
Intermediate & Advanced SEO | seowoody
-
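If the single-page route (option 2) were chosen, structured data is one way to expose the variant numbers without stuffing them into the title or H1. A hedged sketch using the Product type's `mpn` plus `additionalProperty` values, rather than `isSimilarTo` (which links distinct Product entities), built with the fictional numbers from the question:

```python
import json

def product_jsonld(name: str, main_mpn: str, alt_numbers: list) -> str:
    """JSON-LD for one page covering every part-number variant: the main
    manufacturer number goes in `mpn`, and each OE/variant number becomes
    a PropertyValue under additionalProperty."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "mpn": main_mpn,
        "additionalProperty": [
            {"@type": "PropertyValue", "name": "OE number", "value": n}
            for n in alt_numbers
        ],
    }
    return json.dumps(data, indent=2)

print(product_jsonld("Throttle Body", "S-AB-Q.123.53G",
                     ["2.78972.11.0", "VWA 9808e40923G"]))
```

Whether this markup influences which numbers surface in the SERP snippet is not guaranteed; it simply makes every identifier machine-readable on one page instead of ten near-duplicates.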
Consolidating MANY separate domains into a much better, single URL: should I point a landing page or redirect to the new site?
I am consolidating a site for a client who previously, and very foolishly, broke up their domains like so:

companyparis.com
companyflorence.com
companyrome.com
etc.

I am now done with the new site, which will be at company.eu, with pages as appropriate:

company.eu/paris
company.eu/florence
company.eu/rome

This domain, although not entirely new, does not have much authority or rank. In terms of SEO and link building, is it better to redirect each old domain to the specific page on the new domain (companyparis.com --> company.eu/paris), or to put a landing page at the old domain linking to the page on the new domain (companyparis.com --> landing page linking to --> company.eu/paris)?
Intermediate & Advanced SEO | thongly
Dynamic URLs Appearing on Google Page 1. Convert to Static URLs or not?
Hi, I have a client who uses dynamic URLs throughout his site. For SEO purposes, I've advised him to convert dynamic URLs to static URLs whenever possible. However, the client has a few dynamic URLs that are appearing on Google page 1 for strategically valuable keywords. For these URLs, is it still worth 301ing them to static URLs? In this case, what are the potential benefits and/or pitfalls?
Intermediate & Advanced SEO | mindflash
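Whichever way the page-1 URLs are handled, the mechanical part of a dynamic-to-static conversion is a deterministic mapping from query parameters to path segments, which each old URL can then 301 to. A sketch of that mapping; the parameter names are invented for illustration, since the client's real CMS parameters aren't given:

```python
from urllib.parse import urlparse, parse_qsl

def static_equivalent(dynamic_url: str, path_params: tuple) -> str:
    """Derive a static-looking path by promoting the named query
    parameters into path segments, in the order given."""
    parts = urlparse(dynamic_url)
    params = dict(parse_qsl(parts.query))
    segments = [params[p] for p in path_params if p in params]
    return parts.path.rstrip("/") + "/" + "/".join(segments)

print(static_equivalent("https://example.com/product?cat_id=87&kw=CROM",
                        ("cat_id", "kw")))
```

Because the mapping is deterministic, the same function can generate the 301 rules, so every ranking dynamic URL redirects to exactly one static counterpart rather than risking a one-to-many mess.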