Crawl errors - Should you redirect deleted WordPress pages if they have no backlinks?
-
Question answered
-
Hello Logan,
Great, thank you for getting back to me. I have gone through and set up 410s for all the old content. I guess only time will tell if this is the right solution.
Regards
Rob
-
Hi Rob,
Since you've got no links pointing to these pages, a 410 is your best bet. It will get them removed from the index the quickest, and you'll start to see these errors in Search Console drop. 301 redirects would also get the old URLs removed from the index, but they add a performance hit on every request, and since these pages can't be reached other than through the SERPs anyway, 301s aren't going to provide much long-term value.
Here's some more info on the difference between 404s and 410s: https://searchenginewatch.com/sew/how-to/2340728/matt-cutts-on-how-google-handles-404-410-status-codes
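For anyone reading along, here's a minimal sketch of how 410s can be set up on a typical Apache/WordPress install, using mod_alias in .htaccess (the paths below are hypothetical examples, not Rob's actual URLs):

```apache
# .htaccess: return 410 Gone for removed posts
# "gone" is mod_alias shorthand for a 410, so no target URL is given.
Redirect gone /old-post-one/
Redirect gone /old-post-two/
```

A 410 tells crawlers the page was removed deliberately, which is what gets it dropped from the index faster than a plain 404.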
Related Questions
-
Google Search Console showing 404 errors for product pages not in sitemap?
We have some products whose URLs changed over the past several months. Google is showing these as having 404 errors even though they are not in the sitemap (the sitemap shows the correct new URL). Is this expected? Will these errors eventually go away/stop being monitored by Google?
Technical SEO | woshea0
-
520 Error from crawl report with Cloudflare
I am getting a lot of 520 Server Errors in crawl reports. I see this is related to Cloudflare. We know 520 is Cloudflare, so maybe the Moz team can change this from "unknown" to "Cloudflare 520". Perhaps the Moz team can also update the "how to fix" section in the reporting, if they have suggestions on how to avoid seeing these in the report, or if there is a real issue that needs to be addressed; at this point I don't know. There must be a solution Moz can provide, like a setting in Cloudflare that will permit Rogerbot if Cloudflare is blocking it because it doesn't like its behavior. It could also be that Rogerbot crawled my site on a bad day, or at a time when we were deploying a massive site change. If I know when my site will be down, can I pause Rogerbot? I found this: https://developers.cloudflare.com/support/troubleshooting/general-troubleshooting/troubleshooting-crawl-errors/
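One hedged sketch for the "pause Rogerbot" part: Moz's documentation describes rogerbot as obeying robots.txt, including Crawl-delay, so a temporary rule like the one below could throttle it around a big deployment. The delay value here is an arbitrary example:

```
User-agent: rogerbot
Crawl-delay: 10
```

Swapping the Crawl-delay line for "Disallow: /" would stop the crawl entirely until the rule is removed.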
Technical SEO | awilliams_kingston0
-
Is there a way to see Crawl Errors older than 90 days in Webmaster Tools?
I had some big errors show up in November, but I can't see them anymore, as the history only goes back 90 days. Is there a way to change the dates in Webmaster Tools? If not, is there another place I'd be able to get this information? We migrated our hosting to a new company around that time, and the agency that handled it for us never downloaded a copy of all the redirects that were set up on the old site.
Technical SEO | b4cab0
-
Should I delete a page or remove links on a penalized page?
Hello All, If I have an internal page that has low-quality links pointing to it, or a penalty, can I just remove the page and start over, versus trying to remove the links? Over time, wouldn't this page disappear along with the penalty on that page? Kinda like pruning a tree? Cutting off the junk limbs so others could grow stronger, or to start fresh new ones. Example: www.domain.com Penalized internal page: (Say this page is penalized due to keyword stuffing, and has low-quality links pointing to it, like blog comments or profiles) www.domain.com/penalized-internal-page.com Would it be effective to just delete this page (www.domain.com/penalized-internal-page.com) and start over with a new page? New internal page: www.domain.com/new-internal-page.com I would of course lose any good links pointing to that page, but it might be easier than trying to remove old backlinks. Thoughts? Thanks! Pete
Technical SEO | Juratovic0
-
301 redirect domain to page on another domain
Hi, If I wanted to do a 301 permanent redirect on a domain to a page on another domain, would this cause any problems? Let's say I have 4 domains (all indexed, with content), and I decide to create a new domain with 4 pages, one for each old domain. I copy the content from the old domains to the relevant page on the new domain and set it live. At the same time as setting the new site live, I do a 301 permanent redirect on the 4 domains to the relevant pages on the new domain. What happens if Google indexes the new site before visiting the redirected domains? Could this cause a duplicate content penalty? Cheers
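As an illustration of the redirect setup being described, here's a minimal mod_rewrite sketch for one old domain's .htaccess; the domain and path are invented for the example:

```apache
# .htaccess on old-domain-one.com: 301 every request to its page on the new site
RewriteEngine On
RewriteRule ^ https://www.newdomain.com/old-domain-one/ [R=301,L]
```

Each of the 4 old domains would carry its own rule pointing at its corresponding page on the new domain.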
Technical SEO | activitysuper0
-
Delete old site but redirect domain to a new domain and site
I just have a quick query, and I have a feeling about what the answer is, so I wanted to see what you guys thought... Basically, I am working on a client site. This client has a few other websites that are divisions of their company; however, these divisions/websites are no longer used. They want to delete the websites but redirect the domains to their main website. They believe this will pass on SEO benefits, as these old division sites have good PR and history. I'm unsure which way is definitely correct?
Technical SEO | Weerdboil0
-
SEOmoz is showing duplicate page content for my WordPress blog
Hi Everyone, My SEOmoz crawl diagnostics are indicating that I have duplicate content issues in the WordPress blog section of my site, located at: http://www.cleversplash.com/blog/ What is the best strategy to deal with this? Is there a plugin that can resolve this? I really appreciate your help, guys. Martin
Technical SEO | RogersSEO0
-
Can search engines crawl this page?
Hi guys, To cut a long story short, we have had to make a copy of our site and put it on another domain, so in essence there are 2 copies of our site on the web. What we have done is put a username and password on the homepage - http://www.ughhwiki.co.uk/ - and now I just want to be 100% sure that the search engines cannot crawl this? Thank you Jon
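For reference, HTTP basic auth of the kind described returns a 401 Unauthorized to every request without credentials, crawlers included, so search engines can't fetch (let alone index) the protected copy. A minimal Apache sketch, with a hypothetical file path:

```apache
# .htaccess on the duplicate site: require a login for every URL
AuthType Basic
AuthName "Private copy"
AuthUserFile /home/example/.htpasswd
Require valid-user
```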
Technical SEO | imrubbish0