Spike in server errors
-
Hi, we've recently changed shopping cart platforms. In doing so, a lot of our URLs changed, but I 301-redirected all of the significant landing pages (as determined by Google Analytics) prior to the switch.
However, WMT is now warning me about a spike in server errors for all the pages that no longer exist. Google is only crawling them because they used to exist, or because they are linked from pages that used to exist.
Is this something I should worry about? Or let it run its course?
-
1. Have you submitted the new pages via a sitemap or Fetch as Google? It could be that Google is still trying to crawl all the old pages.
2. Have you manually checked some of the pages to see if the redirect is working? If they are, you may just be seeing reporting lag in GWT (it's not perfect, and will often give warnings about old pages, even ones you've 301ed).
Once you've submitted the pages and verified that they are redirecting properly, let it run its course. Be sure to double-check in Analytics that you are not losing traffic to any non-functioning redirects.
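A quick way to do the manual check in point 2 at scale is to request each old URL and confirm it returns a 301 with the expected Location header. A minimal offline sketch; the `example.com` URL pairs are hypothetical stand-ins for your own redirect map, and `fetch` is left pluggable (in practice it would issue a HEAD request with redirect-following disabled):

```python
def redirect_ok(status, location, expected_url):
    """True if the response is a permanent redirect to the expected new URL."""
    return status == 301 and location == expected_url

def audit(fetch, redirect_map):
    """fetch(url) -> (status, location).

    Returns the old URLs that are NOT redirecting properly."""
    return [old for old, new in redirect_map.items()
            if not redirect_ok(*fetch(old), new)]

# Hypothetical old -> new URL pairs from the cart migration.
redirect_map = {
    "http://example.com/old-cart/widget": "http://example.com/shop/widget",
    "http://example.com/old-cart/gadget": "http://example.com/shop/gadget",
}
```

Anything `audit` returns is a redirect to fix; an empty list means the GWT warnings are just old-data lag.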
-
I would spend some time investigating the errors. If you 301-redirected everything, you should not be getting 404 errors, so you may have an issue with your redirects or with your new cart.
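One concrete way to investigate is to diff the error URLs GWT reports against your redirect map: any old URL missing from the map will 404. A rough sketch, with hypothetical placeholder URLs (feed it the URL column from the GWT crawl-errors export):

```python
def find_unmapped(error_urls, redirect_map):
    """Old URLs that GWT flags as errors but that have no redirect configured."""
    return sorted(set(error_urls) - set(redirect_map))

# Hypothetical GWT error export and redirect map.
gwt_errors = [
    "http://example.com/old-cart/widget",
    "http://example.com/old-cart/thingamajig",  # never redirected -> 404s
]
redirect_map = {
    "http://example.com/old-cart/widget": "http://example.com/shop/widget",
}
```

URLs in the result were missed during the migration and need redirects (or a deliberate 410) added.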
Related Questions
-
Why is Copyscape showing a content duplication error even after implementing a 301 redirect?
We maintain the corporate website of one of our clients, "FineTech Toolings" (http://www.finetechtoolings.in). I recently raised a question about 2 websites running in parallel on 2 different domains (1 organisation having 2 different websites on 2 different domains). The domain has since changed from http://www.finetechtoolings.co.in to http://www.finetechtoolings.in via a 301 redirect, but I am still facing a content duplication issue according to Copyscape. My question: even though I have implemented the 301 redirect (http://www.finetechtoolings.co.in redirects to http://www.finetechtoolings.in), which should be completely fine by SEO rules, why is Copyscape still showing that duplicate content exists on the former website?
Technical SEO | | KDKini0 -
301 redirect: keep HTML files on the server?
Hello, just one quick question which came up in the discussion here: http://moz.com/community/q/take-a-good-amount-of-existing-landing-pages-offline-because-of-low-traffic-cannibalism-and-thin-content When I do a 301 redirect that merges the content of 2 pages, should I keep the redirecting page's HTML file on the server, or should I delete it? Or does it make no difference at all?
Technical SEO | | _Heiko_0 -
Can an increase in crawl errors in GWT be caused by input fields and jQuery?
Dear Mozzers, we took over www.urgiganten.dk not long ago, and last week we opened it up for indexation after having taken the old website down for a couple of months. One week after opening for indexation we saw a huge increase in crawl errors. Google is discovering some odd links, e.g. http://www.urgiganten.dk/30-garmin-urremme/, which returns a 404. In GWT we are told that we link to this URL from http://www.urgiganten.dk/garmin-urremme, but nowhere on that page will you find the link. You will, however, find a script in the source code, which is the only part of the page that contains "/30-garmin-urremme/". Can it be true that Google takes the id from the script and appends it to our domain to form a URL? We have seen quite a lot of these errors, not only on urgiganten.dk but also on some of our other websites!
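Googlebot does extract URL-like strings from JavaScript and resolves them against the page it found them on, so a root-relative string such as `/30-garmin-urremme/` sitting inside a script can become a crawlable (and 404ing) URL. The resolution can be reproduced with Python's stdlib:

```python
from urllib.parse import urljoin

# The page GWT reports as the "linked from" source:
page = "http://www.urgiganten.dk/garmin-urremme"

# A root-relative, URL-like string found inside a script tag on that page:
js_string = "/30-garmin-urremme/"

# Googlebot resolves the string against the page, yielding the 404ing URL:
discovered = urljoin(page, js_string)
print(discovered)  # http://www.urgiganten.dk/30-garmin-urremme/
```

This also explains the GWT "linked from" report: the string occurs on that page, even though no `<a>` tag does. If the discovered URL should never be crawled, serving a 410 or blocking the pattern in robots.txt are the usual fixes.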
Technical SEO | | urgiganten0 -
Crawl errors: 301 (permanent redirect)
Hi, here are some questions about SEO Crawl Diagnostics. We recently found these 301 (permanent redirect) notices for our website, and we concluded that the two factors below are the causes: 1. Some of our URLs without a trailing slash are automatically redirected to the same URL with a trailing slash. 2. For SEO reasons, the site automatically redirects typed-in URLs to more SEO-friendly ones. For example, www.example.com/b1002/ automatically redirects to www.example.com/banana-juice/. The question is: are these significant enough for SEO that they need to be modified? One of the errors on our blog was having too many on-page links. Is this also a significant error, and if so, how many on-page links are recommended from an SEO perspective? Thanks in advance.
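Single-hop 301s like these are normal and nothing to fix on their own; what is worth catching is a chain (no-slash → slash → SEO-friendly URL), since every extra hop wastes crawl budget. A small sketch that counts hops through a redirect map, using hypothetical example.com URLs:

```python
def redirect_hops(url, redirect_map, limit=10):
    """Follow a URL through a dict of 301s; return (hop_count, final_url)."""
    hops = 0
    while url in redirect_map and hops < limit:
        url = redirect_map[url]
        hops += 1
    return hops, url

# Hypothetical chain: missing slash -> trailing slash -> friendly URL
redirects = {
    "http://www.example.com/b1002": "http://www.example.com/b1002/",
    "http://www.example.com/b1002/": "http://www.example.com/banana-juice/",
}
```

Any URL that comes back with more than 1 hop should have its first rule repointed straight at the final destination.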
Technical SEO | | Glassworks0 -
Getting a 404 error when opening the cached link of my site
My site is hazanstadservice.se, and when I try to open its cached copy to check the cache date, I get a 404 error from Google. I don't know why. The cache page URL is http://webcache.googleusercontent.com/search?q=cache:j99uW96RuToJ:www.hazanstadservice.se/+&cd=1&hl=en&ct=clnk.
Technical SEO | | Softlogique0 -
Website of only circa 20 pages drawing 1,000s of errors?
Hi, one of the websites I run is getting 1,000s of errors for duplicate titles/content even though there are only approximately 20 pages. SEOMoz seems to be finding pages that have duplicated themselves: for example, the blog page (/blog) is appearing as /blog/blog, then /blog/blog/blog, and so on. Can anyone shed some light on why this is occurring? Thanks.
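This pattern is the classic relative-link trap: if a link on the blog page is written as `href="blog"` instead of the root-relative `href="/blog"`, each crawled copy resolves one level deeper, and as long as the server answers those deeper paths with a 200, the crawler finds an endless supply of duplicates. The stdlib shows the effect (example.com stands in for the real site):

```python
from urllib.parse import urljoin

# The blog page as served, with a trailing slash:
page = "http://example.com/blog/"

bad_href = "blog"    # relative: resolves under the current path
good_href = "/blog"  # root-relative: always resolves to the same URL

print(urljoin(page, bad_href))   # http://example.com/blog/blog
print(urljoin(page, good_href))  # http://example.com/blog
```

Fixing the hrefs to be root-relative (or having the server 404/301 the recursive paths) stops the duplication at its source.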
Technical SEO | | TheCarnage0 -
Does server affect indexing speeds?
A bit of a strange question, this one: I have a domain which, when on my Dutch server, can get new blog posts indexed and ranking in less than 10 minutes using the PubSubHubbub plugin. However, I moved the blog and domain to a UK dedicated server and continued to post; days later, none of those posts were indexed. I then moved the domain back to the Dutch server to test this, posted to the blog, and once again it was indexed and ranking in 20 minutes or so. To cut a long and tedious story short: in a bid to be closer to my customers, I moved the domain to a UK VPS three days back. I posted, but no posts are indexed. Has anyone else experienced anything like this? Generally I don't move domains back and forth so much, but I wanted to test this out. The Dutch server is a 16-core, 24 GB Direct Admin dedicated box; the two UK servers were both running cPanel. I understand it's best to host as close as possible to your customers, but the difficulty of getting posts indexed in the UK is becoming a problem. Thanks, Carl
Technical SEO | | Grumpy_Carl1 -
Google causing Magento Errors
I have an online shop run using Magento. I recently upgraded to version 1.4 and installed an extension called Lightspeed, a caching module which makes tremendous improvements to Magento's performance. Unfortunately, a configuration problem meant I had to disable the module, because it was generating session-related errors if you entered the site from any page other than the home page. The site is now working as expected. I have Magento's error notification set to email, and I've not received emails for errors generated by visitors. However, over a 72-hour period, I received a deluge of error emails caused by Googlebot. It was generating an error in a file called lightspeed.php. Here is an example: URL: http://www.jacksgardenstore.com/tahiti-vulcano-hammock IP Address: 66.249.66.186 Time: 2011-06-11 17:02:26 GMT Error: Cannot send headers; headers already sent in /home/jack/jacksgardenstore.com/user/jack_1.4/htdocs/lightspeed.php, line 444 Several things of note: I deleted lightspeed.php from the server before any of these error messages began to arrive. lightspeed.php was never exposed in the URL at any time; it was only referred to in a mod_rewrite rule in .htaccess, which I also commented out. If you click the URL in the error message, it loads in the browser as expected, with no error messages. It appears that Google has cached a version of the page which briefly existed while Lightspeed was enabled, but I thought Google cached generated HTML. Since when does Google cache a server-side PHP file? I've just used the Fetch as Googlebot facility in Webmaster Tools on the URL from the above error message, and it returns the page as expected, with no errors. I've had no errors at all in the last 48 hours, so I'm hoping it has sorted itself out, but I'm concerned about any Google-related implications. Any insights would be greatly appreciated. Thanks, Ben
Technical SEO | | atticus70