Google Crawl Errors from vBSEO Change
-
We have vBSEO set up on our site. A setting was changed unexpectedly and went unnoticed; it changed the URLs of all our pages, so none of them were being indexed by Google any longer due to 401 errors. Most of our search engine traffic fell off.
We discovered the issue a couple of weeks ago and changed the setting back so that the URLs are the same as they were originally, but Google Webmaster Tools is still showing crawl errors and our search engine traffic hasn't recovered at all. We submit sitemaps daily.
-
That seems a bit longer than I would have expected. I would consider adding 301 redirects for all of the "temporary" URLs so that they point back to the original/current pages.
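Since vBSEO rewrites URLs in a predictable pattern, one way to cover all of the "temporary" URLs at once is to generate the 301 rules from a mapping. A minimal sketch (the paths below are hypothetical placeholders, not the asker's actual URLs):

```python
# Sketch: emit Apache .htaccess "Redirect 301" lines from a mapping of
# the temporary vBSEO URLs back to the original URLs.
# All paths here are hypothetical examples.

def make_301_rules(url_map):
    """Return one 'Redirect 301 old new' line per old -> new path pair."""
    return ["Redirect 301 {} {}".format(old, new) for old, new in url_map.items()]

url_map = {
    "/f12/some-thread-123.html": "/forum/some-thread-123/",
    "/f12/another-thread-456.html": "/forum/another-thread-456/",
}

for rule in make_301_rules(url_map):
    print(rule)
```

Dropping the output into the site's `.htaccess` (or the equivalent rewrite rules for nginx) tells Google the old URLs moved permanently rather than disappeared.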
-
Thanks for the feedback. We made the change about 5-6 weeks ago and are still not seeing a recovery in anything but the number of links indexed. Here's a screenshot of the drop and non-recovery.
-
You'll find that Google will keep retrying those broken links for months. The frequency will go down, and eventually they'll stop trying.
In the meantime, the changed URLs undoubtedly made a mess of Google's calculations for internal link juice, and you should expect it to take about a month for that to settle back to normal as well.
Related Questions
-
No-return tag error
I'm receiving a "no return tag" error in Search Console for pages on our site that are unrelated. Originating page: https://www.eginnovations.com/in-the-news/performance-assurance-a-key-to-virtual-desktop-success Alternate URL: https://www.eginnovations.com/fr/ There is a link to the /fr/ page in the language switcher at the top of the page, but I can't figure out why this would throw an error from the originating page. Any help would be appreciated!
On-Page Optimization | eGInnovations
Removing old URLs from Google
We rebuilt a site about a year ago on a new platform, but Google is still indexing URLs from the old site that we have no control over. We had hoped that time would have cleaned these out, but they are still being flagged under HTML Improvements in GWT. Is there anything we can do to get these "external" URLs dropped from the index, given that they are still being picked up after a year?
On-Page Optimization | Switch_Digital
Google webmaster markup validation error
Type: schema.org Product. Property: image. Error: Missing required field "name (fn)". Google Webmaster Tools shows this error when I try to validate the markup. This is my domain: www(dot)wishpicker(dot)com. Would be great if someone could help with this. Thanks, Prakul
On-Page Optimization | bansheeviv
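The validator is saying the `Product` item has an `image` but no `name`, which schema.org requires. A minimal JSON-LD sketch of a Product block that includes the missing field (all values here are hypothetical, not from the asker's site):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Gift Item",
  "image": "https://example.com/images/gift.jpg",
  "description": "Hypothetical product used only to illustrate the required name field."
}
```

The same applies if the markup is microdata: each `itemscope` with `itemtype="https://schema.org/Product"` needs an `itemprop="name"` alongside `itemprop="image"`.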
Google crawler showing cache of another page
For the page http://www.thinkdigit.com/top-products/Laptops-and-PCs/top-10-laptops-124.php, Google is showing another page in its cache (http://www.thinkdigit.com/top-products/Ultrabooks/top-10-ultrabooks-153.php). Please let me know how this happened and how to correct it.
On-Page Optimization | 9dot9
Duplicate Page Titles in Crawl Errors (although Google is rewriting in serps ??)
Hi, I'm working on a client project and the crawl report is showing thousands of duplicate page titles. In the blog/news section it's approximately 50, since there are about 50 posts and they all share the same meta title, "Brand News | Brand", as opposed to "Title Unique to Page/Topic/KW Relating to Content | Brand". Since these are the main content pages we want to rank (in addition to the main site category pages), I have instructed the dev team to prioritise populating these pages' meta titles with the actual post/article titles, as in the latter example above. (I should mention that I have requested they fix all duplicate titles, but the main content pages are the priority.) While this will reduce the number of duplicate titles in the crawl error/warning report, which is a good thing, is it actually likely to improve the ranking of these news/content pages, given that Google already seems to be rewriting the titles correctly in the SERPs based on the page content? Many thanks in advance for your input.
On-Page Optimization | Dan-Lawrence
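The fix being requested of the dev team amounts to a one-line title template. A trivial sketch (brand and post titles are hypothetical):

```python
# Sketch: build a unique <title> per post instead of the shared
# "Brand News | Brand" template. The brand name is a placeholder.

BRAND = "Brand"

def page_title(post_title):
    """Unique, keyword-relevant title: '<Post Title> | <Brand>'."""
    return "{} | {}".format(post_title.strip(), BRAND)

print(page_title("How We Fixed Our Crawl Errors"))
```

Each post's own headline becomes the unique part of the title, so no two posts share a meta title.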
My meta description changes when I use different keywords in Google search
Hello everyone, I have a question for the community. I manage a website with several articles and news pages. I set a specific meta description for every page, but when I search in Google it returns different meta descriptions depending on the keyword I use. What I notice is that Google looks in my page for the part of the text most relevant to my keyword and returns that as the snippet. I thought this only happened when the meta description was empty. Has anyone seen the same? Best, Ricardo www.meuportalfinanceiro.pt
On-Page Optimization | Adclick
How to design a site map page for users (not for Google)
I would like to design a site map for my visitors so they can get a quick view of the whole content of the website. Two questions: 1 - Can this kind of site map help in terms of SEO? 2 - If so, what are the best practices for designing it? Thanks in advance.
On-Page Optimization | betadvisor
Should Google index a search result page?
Hi, I have a website with about 1,000 articles. Each article has one or more keywords/tags, so I display these keywords on the article page and link each one to the internal search engine (like a tag cloud). The search engine lists all articles with the same keyword and creates a result page, and these result pages are indexed by Google. Each search result contains the title of the article, a short description (150-300 characters), and a link to the article. So Google believes there are about 5,000 pages instead of 1,000 because of the links to the search result pages. My old rule of thumb was: more pages in Google = better. But is this still true nowadays? Would "noindex, follow" be better on these search result pages? (Is there a way to tell Google that a page is a search result page?) Best wishes, Georg.
On-Page Optimization | GeorgFranz
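There is no markup that explicitly declares "this is a search result page", but the "noindex, follow" option the asker mentions is expressed with a robots meta tag on the result-page template. A minimal sketch (template placement is hypothetical):

```html
<!-- In the <head> of the internal search-result template only,
     not on the 1,000 article pages themselves: -->
<meta name="robots" content="noindex, follow">
```

This keeps the ~4,000 thin result pages out of the index while still letting crawlers follow their links through to the articles.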