Very, very confusing behaviour with 301s. Help needed!
-
Hi SEOMoz gang!
I've been a long-time reader and lurker here, but now I need to pick your brains.
I've been working on two websites in the last few days which are showing very strange behaviour with 301 redirects.
Site A
This site is an ecommerce site stocking over 900 products and thousands of motor parts. The old site was turned off in Feb 2011 when we built them a new one. The old site had terrible problems with canonical URLs, where every search could generate a unique URL, e.g. domain.com/results.aspx?product=1234. When you have thousands of products and Google can find them all, that's a big problem. Or was. We launched the new site, 301'd all of the old results pages over to the new product pages, and deleted the old results.aspx.
The results.aspx page wasn't indexed or shown for months. Then, about two months ago, we found certain conditions under which the right 301 wouldn't fire, so we had to put the results.aspx page back in place. If it found the product, it 301'd; if it didn't, it redirected to the sitemap.aspx page. We recently found that some bizarre scenario actually caused the results.aspx page to return a 200 rather than a 301 or 404. Problem. We discovered this last week after our 404 count in GWMT climbed to nearly 90k. Stranger still, the results.aspx URLs were in the format of the OLD site rather than the new one. Those old URLs should have been forgotten after several months, but they started appearing again!
When we saw the 404 count get so high last week, we took severe action and 301'd everything which hit the results.aspx page to the home page. No problem, we thought. But when we got into the office on Monday, most of our product pages had dropped from the top 20 placings they had (nearly 400 rankings lost), and on some phrases the old results.aspx pages started to show up in their place!!
Can anyone think why old pages, some of which have been 301'd over to new pages for nearly 6 months, would start to rank again, even though the pages didn't exist for several months? Surely if they are 301'd, then after a while they should drop out of the index?
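For anyone wanting to reproduce what we saw: once you've collected the status code each legacy URL currently returns (from logs or a quick crawl), a small check like this flags the ones misbehaving. The URLs below are hypothetical examples, not our real site's.

```python
# Sketch: given status codes already collected for legacy URLs,
# flag any that do NOT return the expected 301.

def audit_legacy_urls(status_by_url, expected=301):
    """Return the URLs whose status differs from the expected redirect code."""
    return {url: code for url, code in status_by_url.items() if code != expected}

crawl_results = {
    "http://example.com/results.aspx?product=1234": 301,  # redirects correctly
    "http://example.com/results.aspx?product=9999": 200,  # serves a page: problem
    "http://example.com/results.aspx?search=abc": 404,    # not redirected at all
}

problems = audit_legacy_urls(crawl_results)
# problems maps the two misbehaving URLs (the 200 and the 404) to their codes
```

In our case it was exactly those 200s on old-format URLs that put the dead pages back in Google's index.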
Site B
This site moved domain a few weeks ago. Traffic has been lost on some phrases, but this was mainly due to old blog articles not being carried forward (what I'll call noisy traffic, which was picked up by accident and had poor on-page stats).
No major loss in traffic on this one, but again bizarre errors in GWMT. This time, pages which haven't existed for several YEARS are showing up as 404s in GWMT. The only place they are still referenced anywhere is in the redirect table on our old site. When the new site went live, all of the pages which were in Google's index and in Open Site Explorer were handled in a new 301 table. We thought we didn't need to worry about the old 301s, as they had been pointing from old pages to new pages for several years and we assumed the old pages had been delisted. We couldn't see them anywhere in any index.
So... my question here is: why would some old pages which have been 301'ing for years now show up as 404s on my new domain?
I've been doing SEO on and off for seven years, so I think I know most things about how Google works, but this is baffling. It seems that two different sites have failed to prevent old pages from cropping up which were 301'd for either months or years. Does anyone have any thoughts as to why this might be the case?
Thanks in advance.
Andy Adido
-
You are experiencing two problems, but there is a common thread here:
you are deleting (or not re-implementing) the "old" redirects.
As you know, when you 301 page A, you have to keep that redirect in place for as long as there is a website on that domain. There is no threshold after which a 301 is no longer needed; you can't drop it after a year, for example.
There is a good video from Matt Cutts on this, which I've posted in the references.
You can conclude that, in a way, a 301 redirect should never be deleted, especially once Google starts receiving mixed signals about the domain.
So the solution to your problem is to re-implement the old redirects you had before.
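When you bring the old table back alongside the new one, it's worth resolving each legacy URL through every hop so the final rule 301s straight to the current page, rather than creating chains like old page -> intermediate page -> new page. A minimal sketch of that merge (the paths are hypothetical):

```python
# Sketch: collapse a combined redirect table so every source points
# directly at its final destination, with a guard against loops.

def collapse_redirects(redirects):
    """Follow each mapping to its final target, one hop at a time."""
    collapsed = {}
    for source in redirects:
        target, seen = redirects[source], {source}
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        collapsed[source] = target
    return collapsed

table = {
    "/old-page.htm": "/products/widget",   # rule carried over from the old site
    "/products/widget": "/shop/widget",    # rule added during the migration
}
final = collapse_redirects(table)
# final["/old-page.htm"] == "/shop/widget" -- one hop instead of two
```

Single-hop redirects pass signals more cleanly than chains, and the merged table means the years-old URLs never fall through to a 404.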
Hope this helps.