How to fix issues from 301s
-
Case:
We are currently in the middle of a site migration from .asp to .net and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302ed to the new versions. Once this was caught, they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but now (approximately 3 weeks after the change from 302 to 301) we are experiencing a significant decline in visits.
Issue:
My assumption is that many of the internal links to these primary landing pages (from pages which have now been 301ed as well) still point to the old versions in Google's cache, and thus have not passed their importance and internal link equity on to the new versions. There are no navigational links or entry points left to the old supporting pages, and I believe this is what is driving the decline.
Proposed resolution:
I intend to create a series of HTML sitemaps of the old (.asp) versions of all pages which have recently been 301ed. I will then submit these pages to Google's index (not as XML sitemaps, just as normal pages) with the option to index all linked pages. My intention is to force Google to pick up all of the 301s, thus reinforcing the authority channels we have set up.
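To make that concrete, here is a rough sketch of how I imagine generating those HTML sitemap pages, assuming the old .asp URLs are already collected in a plain text file. The file names and page size below are hypothetical placeholders, not our actual setup:

```python
# Sketch: build simple HTML pages that link out to every old, 301'd .asp URL,
# so a crawler following one of these pages would hit all of the redirects.
from pathlib import Path

OLD_URLS_FILE = "old_asp_urls.txt"   # hypothetical: one old .asp URL per line
OUTPUT_PREFIX = "html-sitemap-old"   # hypothetical output file name prefix
URLS_PER_PAGE = 100                  # keep each sitemap page a manageable size

urls = [u.strip() for u in Path(OLD_URLS_FILE).read_text().splitlines() if u.strip()]

for page_num, start in enumerate(range(0, len(urls), URLS_PER_PAGE), start=1):
    chunk = urls[start:start + URLS_PER_PAGE]
    links = "\n".join(f'    <li><a href="{u}">{u}</a></li>' for u in chunk)
    html = f"""<!DOCTYPE html>
<html>
<head><title>Old page index {page_num}</title></head>
<body>
  <ul>
{links}
  </ul>
</body>
</html>
"""
    out_file = Path(f"{OUTPUT_PREFIX}-{page_num}.html")
    out_file.write_text(html)
    print(f"wrote {out_file} with {len(chunk)} links")
```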
Question 1:
Is the assumption that the decline could be because of missed authority signals reasonable?
Question 2:
Could the proposed solution be harmful?
Question 3:
Will the proposed solution be adequate to resolve the issue?
Any help would be sincerely appreciated.
Thank you in advance,
David
-
I did run a search on our old pages in the SERPs and found a large number of them are still showing. I also found most of our new pages, and in some cases both the old and new versions were represented. I have also seen a lot of our positions drop from page one to outside the top 100; these are all pages which were 301ed to a nearly exact replica in the new version. I had originally thought Google had recrawled them but not updated the listings to the new versions. I am now thinking that they are simply being ignored and have not had their 301s picked up.
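To rule out a redirect problem on our end before blaming Google, it may also be worth programmatically confirming that every old URL now returns a single 301 rather than a lingering 302, a 200, or a redirect chain. A minimal sketch, assuming the old URLs sit in a text file (the file name is a placeholder):

```python
# Sketch: report the first response status and redirect target for each old URL,
# flagging anything that is not a clean 301.
import requests

OLD_URLS_FILE = "old_asp_urls.txt"  # hypothetical: one old .asp URL per line

with open(OLD_URLS_FILE) as f:
    old_urls = [line.strip() for line in f if line.strip()]

for url in old_urls:
    # allow_redirects=False so we inspect the first hop, not the final target
    resp = requests.get(url, allow_redirects=False, timeout=10)
    status = resp.status_code
    target = resp.headers.get("Location", "-")
    flag = "OK" if status == 301 else "CHECK"
    print(f"{flag}\t{status}\t{url} -> {target}")
```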
-
Hi David,
Can you run a site:yourwebsite.com search in Google and report back? I.e. find some of those pages and, if the URLs are the same, check the cache date on them. Similarly, if the URL structure has changed, you can check it by combing through all of the Google results for your site's pages in the SERPs. If old URLs that have been 301'd still appear, then they haven't been recrawled and updated.
Once you know whether the issue is that the pages simply haven't been recrawled, you'll have a much better idea of what is going on and can go from there.