How to fix issues from 301s
-
Case:
We are currently in the middle of a site migration from .asp to .NET and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302'd to the new versions. Once this was caught, they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but currently (approximately 3 weeks after the change from 302 to 301) we are experiencing a significant decline in visits.
Issue:
My assumption is that many of the internal links to these primary landing pages (from pages which are now 301'd as well) still point to the old versions in Google's cache, and thus have not passed their importance and internal link juice to the new versions. There are no navigational links or entry points to the old supporting pages left, and I believe this is what is driving the decline.
Proposed resolution:
I intend to create a series of HTML sitemaps of the old (.asp) versions of all pages which have recently been 301'd. I will then submit these pages to Google's index (not as XML sitemaps, just as normal pages) with the option selected to index all linked pages. My intention is to force Google to pick up all of the 301s, thus reinforcing the authority channels we have set up.
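A minimal sketch of what one of those HTML sitemap pages could look like, generated with a few lines of Python. The URLs, page title, and markup here are purely illustrative, not the site's real paths:

```python
# Hypothetical sketch: build a plain HTML page linking to the old .asp URLs
# so a crawler can rediscover them and follow their 301s. The URLs, title,
# and markup are illustrative only.
from html import escape

def build_html_sitemap(urls, title="Legacy page index"):
    # One list item per old URL, with the URL itself as the anchor text.
    items = "\n".join(
        f'    <li><a href="{escape(u, quote=True)}">{escape(u)}</a></li>'
        for u in urls
    )
    return (
        "<!DOCTYPE html>\n"
        "<html>\n"
        f"<head><title>{escape(title)}</title></head>\n"
        "<body>\n"
        f"  <ul>\n{items}\n  </ul>\n"
        "</body>\n"
        "</html>"
    )

old_urls = [
    "http://www.example.com/old-landing-page-1.asp",
    "http://www.example.com/old-landing-page-2.asp",
]
print(build_html_sitemap(old_urls))
```

The output is an ordinary static page, so it can be submitted to Google like any other URL.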
Question 1:
Is the assumption that the decline could be because of missed authority signals reasonable?
Question 2:
Could the proposed solution be harmful?
Question 3:
Will the proposed solution be adequate to resolve the issue?
Any help would be sincerely appreciated.
Thank you in advance,
David
-
I did run a search on our old pages in the SERPs and found that a large number of them are still showing. I also found most of our new pages, and in some cases both the old and new versions were represented. I have also seen a lot of our positions go from page one to not in the top 100; these are all pages which were 301'd to a nearly exact replica in the new version. I had originally thought Google had hit them but not updated their listings to the new versions. I am now thinking that they are simply being ignored, and have not had their 301s picked up.
-
Hi David,
Can you run a site:yourwebsite.com search in Google and report back? I.e., find some of those pages and then check the cache date on them if the URLs are the same. Similarly, if the URL structure has changed, you can check it by combing through all of the Google results for your site's pages in the SERPs. If old URLs that have been 301'd still exist in the index, then they haven't been recrawled and updated.
Once we know whether it's an issue of them not having been recrawled, you'll better understand what the possible issue is and can go from there.
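To complement the manual SERP check, a hypothetical helper like the following could request each old URL directly and confirm it answers with a 301 rather than a 302. The domain is a stand-in, and `http.client` is used deliberately because it does not follow redirects:

```python
# Hypothetical sketch: fetch each old URL without following redirects and
# classify the response, so any leftover 302s (or missing 301s) stand out.
# example.com stands in for the real site.
import http.client
from urllib.parse import urlparse

def classify(status):
    """Map an HTTP status code to a redirect category."""
    if status == 301:
        return "permanent"
    if status in (302, 303, 307):
        return "temporary"
    return "not a redirect"

def check_url(url):
    parts = urlparse(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        # Location tells you where the redirect actually points.
        return resp.status, classify(resp.status), resp.getheader("Location")
    finally:
        conn.close()

# Illustrative usage against a list of the old .asp URLs:
# for url in old_asp_urls:
#     print(url, check_url(url))
```

Anything that comes back "temporary" or "not a redirect" is a URL whose 301 never made it live.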
Related Questions
-
Do I need multiple 301s to preserve SEO
Many years ago I created the website www.original.com. Two years ago I redirected www.original.com to www.neworiginal.com using 301 redirects. I have now created www.rebranded.com. I want to maintain all SEO value. Should I redirect both www.original.com and www.neworiginal.com to www.rebranded.com? Or do I only need to redirect one of them, and if only one, which one? If I need to redirect only one, can I delete the other? Why or why not? Of course the URLs are fictitious. I truly appreciate your help.
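On the chain question above, the usual advice is to point every old domain straight at the final one rather than letting redirects hop original → neworiginal → rebranded. A small hypothetical helper makes the idea concrete, using the fictitious domains from the question:

```python
# Hypothetical sketch: given a map of source -> destination redirects,
# collapse chains so every old URL points straight at its final target
# in a single hop. Domains are the fictitious ones from the question.
def flatten_redirects(redirects):
    flat = {}
    for src in redirects:
        seen, dst = {src}, redirects[src]
        # Follow the chain until it leaves the map (or would loop).
        while dst in redirects and dst not in seen:
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat

chain = {
    "www.original.com": "www.neworiginal.com",
    "www.neworiginal.com": "www.rebranded.com",
}
print(flatten_redirects(chain))
# {'www.original.com': 'www.rebranded.com', 'www.neworiginal.com': 'www.rebranded.com'}
```

After flattening, both old domains 301 directly to www.rebranded.com, which avoids stacking two redirects in front of visitors and crawlers.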
Intermediate & Advanced SEO | PhotoStl
-
A new website issue
Hello everybody,
I started a new website 22 days ago at the beginning of this month, and I have long articles. I think this should make the site appear in search results for long-tail keywords even if they are not very relevant, but as you can see in the attached image from my Webmaster Tools, the impression count suddenly increased to 100 and then significantly decreased again, even when I cancel the "filter" option. Is this normal for a 3-week-old website? Or is there something I have to check? Thanks.
Intermediate & Advanced SEO | mtmaster
-
Best Way to Fix Duplicate Content Issues on a Blog if URLs Are Set to "No-Index"
Greetings Moz Community: I recently purchased an SEMrush subscription and used it to run a site audit. The audit detected 168 duplicate content issues, mostly relating to blog post tags. I suspect these issues may be due to canonical tags not being set up correctly. My developer claims that since these blog URLs are set to "no-index", these issues do not need to be corrected. My instinct would be to avoid any risk from potential duplicate content and to set up canonicalization correctly. In addition, even if these pages are set to "no-index", they are passing PageRank. Furthermore, I don't know why a reputable company like SEMrush would report these as errors if in fact they are not errors. So my question is: do we need to do anything with the error pages if they are already set to "no-index"? Incidentally, the site URL is www.nyc-officespace-leader.com. I am attaching a copy of the SEMrush audit. Thanks, Alan
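One way to sanity-check the developer's claim is to look at what the tag pages actually serve. A hypothetical audit helper using only the standard library, with invented sample HTML, could pull out the two signals in question (the robots "noindex" directive and the rel="canonical" link):

```python
# Hypothetical sketch: parse a page's <head> for a robots "noindex" meta tag
# and a rel="canonical" link. The sample page below is made up.
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_startendtag(self, tag, attrs):
        # Treat self-closing tags (<link ... />) the same as open tags.
        self.handle_starttag(tag, attrs)

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "robots":
            self.noindex = "noindex" in a.get("content", "").lower()
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

page = """<html><head>
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://www.example.com/blog/post/">
</head><body></body></html>"""

audit = HeadAudit()
audit.feed(page)
print(audit.noindex, audit.canonical)
# True https://www.example.com/blog/post/
```

Running something like this over the 168 flagged URLs would show whether the "no-index" claim actually holds on every one of them, and whether a canonical is present at all.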
Intermediate & Advanced SEO | Kingalan1
-
Are all duplicate content issues bad? (Blog article Tags)
If so, how bad? We use tags on our blog and this causes duplicate content issues. We don't use WordPress, but with such a highly used CMS having the same issue, it seems quite plausible that Google would be smart enough to deal with duplicate content caused by blog article tags and not penalise at all. It has been discussed here, and I'm ready to remove tags from our blog articles or monitor them closely to see how it affects our rankings. Before I do, can you give me some advice around this? Thanks,
Daniel
Intermediate & Advanced SEO | Daniel_B
-
Google Cache Redirection Issue? Or awaiting DNS Propagation?
We have just launched an Australian version of our .com site. The domain example.com.au used to redirect automatically to example.com. We updated the DNS to point to our new Australian site on 13 August and have started a two-week soft launch to iron out issues and allow us to start the indexing process. The issue is, when I check to see if we have been crawled yet by searching for cache:http://www.example.com.au, the cache automatically redirects to http://www.example.com and shows the cache for that site. The date on the cache is 14 Aug 2012 20:47:34 GMT. My question is, have I missed something during this process, or are we just waiting for the DNS change to fully propagate?
Intermediate & Advanced SEO | Benj25
-
Help, really struggling with fixing mistakes post-Penguin
We had previously implemented a strategy of paying for lots of links and focusing on 3 or 4 keywords as our anchors, which used to REALLY work (I know, I know, bad black-hat strategy; I have since learned my lesson). These keywords and others have since plummeted up to 100 spots since Panda 3.3 and Penguin. So I'm trying to go in and fix all our mistakes, because our domain is too valuable to us to just start over from scratch. Yesterday I literally printed a 75-page document of all of our links according to Open Site Explorer. I have been going in and manually changing anchor text wherever I can, and taking down the most egregious links where possible. This has involved calling and emailing webmasters, digging up old accounts and passwords, and otherwise just trying to diversify our anchor text and remove bad links. I've also gone into our site and edited some internal links (also too heavily weighted on certain keywords) and removed other links entirely. My rankings have gone DOWN more today. A lot. WTF does Google want? Is there something I'm doing wrong? Should we be deleting links from all private networks entirely, or just trying to vary the anchor text? Any advice greatly appreciated. Thanks!
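When working through a 75-page link export like that, tallying the anchor-text distribution first makes it obvious which keywords are over-weighted and worth diversifying. A rough sketch, with invented sample anchors:

```python
# Hypothetical sketch: tally anchor-text frequency from an exported link
# list (e.g. a CSV from Open Site Explorer) as a percentage of all links,
# to spot over-weighted keyword anchors. Sample anchors are invented.
from collections import Counter

def anchor_distribution(anchors):
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    # Percentage share per anchor text, most common first.
    return {text: round(n / total * 100, 1) for text, n in counts.most_common()}

sample = ["cheap widgets", "cheap widgets", "cheap widgets", "Acme Inc", "homepage"]
print(anchor_distribution(sample))
# {'cheap widgets': 60.0, 'acme inc': 20.0, 'homepage': 20.0}
```

An anchor taking up a large majority of the profile is exactly the kind of pattern worth diluting first.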
Intermediate & Advanced SEO | LilyRay
-
How to fix duplicated URLs
I have an issue with duplicated pages. Should I use a canonical tag, and if so, how? Or should I change the page titles? This is causing my pages to compete with each other in the SERPs. The title 'Paradisus All Inclusive Luxury Resorts - Book your stay at Paradisus Resorts' is used on http://www.paradisus.com/booking-template.php and also on:
http://www.paradisus.com/booking-template.php?codigoHotel=5889
http://www.paradisus.com/booking-template.php?codigoHotel=5891
http://www.paradisus.com/booking-template.php?codigoHotel=5910
http://www.paradisus.com/booking-template.php?codigoHotel=5911
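If those codigoHotel variants really are near-duplicates, a canonical tag pointing at whichever URL is chosen as primary is the usual fix. A hypothetical helper that strips the query string to produce the tag; whether the bare template (rather than distinct per-hotel pages with unique titles) is the right canonical target is an SEO judgment call, not something code can settle:

```python
# Hypothetical sketch: emit a rel="canonical" tag that points a
# parameterized URL at its query-free base. Whether the bare template is
# the correct canonical target is a content decision, not shown here.
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url):
    parts = urlsplit(url)
    # Rebuild the URL with the query string and fragment dropped.
    bare = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{bare}">'

print(canonical_tag("http://www.paradisus.com/booking-template.php?codigoHotel=5889"))
# <link rel="canonical" href="http://www.paradisus.com/booking-template.php">
```

If each hotel page is actually distinct content, unique page titles per codigoHotel (and self-referencing canonicals) would be the alternative.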
Intermediate & Advanced SEO | Melia
-
Dynamically generated page issues
Hello all! Our site uses dynamically generated pages. I was about to begin the process of optimising our product category pages at www.pitchcare.com/shop. I was going to use internal anchor-text links from some high-ranking pages within our site, but each of the product category pages already has 1,745 links! Am I correct in saying that internal anchor-text links work only up to a certain point (maybe 10 or so links), so any new internal anchor-text links will count for nothing? Thanks, Todd
Intermediate & Advanced SEO | toddyC