How to fix issues from 301s
-
Case:
We are currently in the middle of a site migration from .asp to .NET and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302ed to the new versions. Once this was caught, they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but currently (approximately 3 weeks after the change from 302 to 301) we are experiencing a significant decline in visits.
Issue:
My assumption is that many of the internal links to these primary landing pages (from pages which are now 301ed as well) still point to the old versions in Google's cache, and thus have not passed their importance and internal link juice to the new versions. There are no navigational links or entry points left to the old supporting pages, and I believe this is what is driving the decline.
Proposed resolution:
I intend to create a series of HTML sitemaps listing the old (.asp) versions of all pages which have recently been 301ed. I will then submit these pages to Google's index (as normal pages, not as XML sitemaps) with the option selected to index all linked pages. My intention is to force Google to pick up all of the 301s, thus reinforcing the authority channels we have set up.
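For what it's worth, a page like that is simple to generate. Here is a minimal sketch; the function name and the example URLs are hypothetical stand-ins, not anything from this thread:

```python
# Build a plain HTML sitemap page that links to the old (now-301'd) URLs, so a
# crawler fetching this page requests each old URL and encounters its 301.
# The example URLs are hypothetical stand-ins for the real .asp pages.
from html import escape

def build_html_sitemap(urls):
    items = "\n".join(
        f'<li><a href="{escape(u)}">{escape(u)}</a></li>' for u in urls
    )
    return (
        "<!DOCTYPE html>\n"
        "<html><head><title>Legacy URLs</title></head>\n"
        f"<body><ul>\n{items}\n</ul></body></html>\n"
    )

old_urls = [
    "http://www.example.com/landing-page-1.asp",
    "http://www.example.com/landing-page-2.asp",
]
print(build_html_sitemap(old_urls))
```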
Question 1:
Is the assumption that the decline could be because of missed authority signals reasonable?
Question 2:
Could the proposed solution be harmful?
Question 3:
Will the proposed solution be adequate to resolve the issue?
Any help would be sincerely appreciated.
Thank you in advance,
David
-
I did run a search on our old pages in the SERPs and found that a large number of them are still showing. I also found most of our new pages; in some cases both the old and new versions were represented. I have also seen a lot of our positions go from page one to outside the top 100, and these are all pages which were 301ed to a nearly exact replica in the new version. I had originally thought Google had recrawled them but not updated their listings to the new versions. I am now thinking they are simply being ignored and have not had their 301s picked up.
-
Hi David,
Can you run a site:yourwebsite.com search in Google and report back? I.e., find some of those pages and, if the URLs are the same, check the cache date on them. Similarly, if the URL structure has changed, you can check by combing through all the Google results for your site's pages in the SERPs. If old URLs that have been 301ed still appear, then they haven't been recrawled and updated.
Once we know whether it's an issue of the pages not having been recrawled, you'll better understand what the possible cause is and can go from there.
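Alongside the SERP check, it's worth verifying server-side that each old URL really does return a single 301 hop to the intended target. A stdlib-only sketch (the URLs in the usage comment are hypothetical):

```python
# Spot-check that an old URL returns a single 301 hop pointing at the intended
# new URL, without following the redirect. Standard library only; the URLs in
# the usage comment are hypothetical, not from this thread.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes urllib raise HTTPError instead of following the hop.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def is_good_301(status_code, location, expected_new_url):
    return status_code == 301 and location == expected_new_url

def check_url(old_url, expected_new_url):
    opener = urllib.request.build_opener(NoRedirect)
    try:
        opener.open(old_url, timeout=10)
        return False  # got a 2xx: no redirect in place at all
    except urllib.error.HTTPError as e:
        return is_good_301(e.code, e.headers.get("Location", ""), expected_new_url)

# usage (hypothetical URLs):
# print(check_url("http://www.example.com/old-page.asp",
#                 "http://www.example.com/new-page"))
```

A 302 or a missing/wrong Location header would show up here as a failure even when the page "redirects fine" in a browser.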
Related Questions
-
SEO issues with masking blog domain?
We have a client who would like to move their WordPress blog to a different server from their main site's server for security reasons. However, the blog is almost 10 years old with good traffic and rankings, and we'd rather not have them change the domain. The developer has come back with a URL "masking" rule in .htaccess that will display the contents of the blog, placed on the new server under a subdomain, while still showing the blog's original URL. If we block the new subdomain from indexing to avoid duplicate content, are there any SEO implications for doing this? Will Google see it as a deceptive practice and tank the blog's rankings? Any advice is greatly appreciated.
Do I need to do 301s in this situation?
We have an e-commerce site built on Magento 2. We launched a few months ago with about 2K categories, and the categories mostly got indexed in Google. Shortly after launch, we decided to go with SLI for search and navigation because the native search/navigation was too slow given our database. The navigation pages are now hosted by SLI, meaning the URLs have changed. I have done 301s for the most popular categories, but I didn't do 301s for all categories, as we have to go through each category one by one and map it to the correct navigation page. Our new category sitemap only lists the new SLI category URLs. Will the fact that we have not 301ed all of our former categories hurt us as far as SEO? Do I have to do 301 redirects for all former category pages?
Same product in different categories and duplicate content issues
Hi, I have some questions related to duplicate content on e-commerce websites.
1) If a single product appears in multiple categories (e.g. a black elegant dress listed in both "black dresses" and "elegant dresses"), is it considered duplicate content even if the product URL is unique?
www.website.com/black-dresses/black-elegant-dress (duplicated: same content from two different paths)
www.website.com/elegant-dresses/black-elegant-dress (duplicated: same content from two different paths)
www.website.com/black-elegant-dress (unique URL: this is the way my product URLs look)
Does Google perceive this as duplicated content? The path to the content is only one, so it shouldn't be seen as duplicated content, even though the product is repeated in different categories. This is my most important concern. It is a small thing, but if I set this up wrong the whole website would be affected and thus penalised, so I need to know how to handle it.
2) I am using WordPress + WooCommerce. The website is built with categories and subcategories. When I create a product in the product page backend, is it advisable to select just the lowest subcategory, or is it better to select both the main category and the subcategory to which the product belongs? I usually select the subcategory alone.
Looking forward to your reply and suggestions. Thanks
Old URLs that have 301s to 404s not being de-indexed.
We have a scenario on a domain that recently moved to enforcing SSL. If a page is requested over non-ssl (http) requests, the server automatically redirects to the SSL (https) URL using a good old fashioned 301. This is great except for any page that no longer exists, in which case you get a 301 going to a 404. Here's what I mean. Case 1 - Good page: http://domain.com/goodpage -> 301 -> https://domain.com/goodpage -> 200 Case 2 - Bad page that no longer exists: http://domain.com/badpage -> 301 -> https://domain.com/badpage -> 404 Google is correctly re-indexing all the "good" pages and just displaying search results going directly to the https version. Google is stubbornly hanging on to all the "bad" pages and serving up the original URL (http://domain.com/badpage) unless we submit a removal request. But there are hundreds of these pages and this is starting to suck. Note: the load balancer does the SSL enforcement, not the CMS. So we can't detect a 404 and serve it up first. The CMS does the 404'ing. Any ideas on the best way to approach this problem? Or any idea why Google is holding on to all the old "bad" pages that no longer exist, given that we've clearly indicated with 301s that no one is home at the old address?
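If it helps to audit these at scale rather than one URL at a time, here's a minimal stdlib sketch that follows a URL's redirect chain and flags the ones that land on a 404; the URL in the usage comment is hypothetical:

```python
# Follow a URL's redirect chain and report where it lands, to flag http URLs
# whose 301 ends on a 404. Standard library only; the example URL in the
# usage comment is hypothetical.
import urllib.error
import urllib.request

def final_status(url):
    """Return (final_url, status_code) after redirects, or (url, None) if unreachable."""
    try:
        resp = urllib.request.urlopen(url, timeout=10)
        return resp.geturl(), resp.status
    except urllib.error.HTTPError as e:
        return e.geturl(), e.code          # e.g. the https URL serving a 404
    except urllib.error.URLError:
        return url, None                   # DNS/connection failure

def is_dead_after_redirect(status_code):
    # 404 (or 410) at the end of the chain marks a removal-request candidate
    return status_code in (404, 410)

# usage (hypothetical URL):
# final_url, code = final_status("http://domain.example/badpage")
# print(final_url, code, is_dead_after_redirect(code))
```

Running this over the old URL list would give you the "bad page" set in one pass instead of discovering them through Search Console one at a time.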
XML sitemap issue... XML sitemap generator including only a few pages for indexing
Help me. I have a website whose XML sitemap previously included 10,000 pages for indexing, but for the last few days the XML sitemap generator has been including only 3,300 pages. Please help me resolve the issue. Google Webmaster Tools shows 8,141 pages indexed. I have tried 2-3 paid tools, but all of them produce only 3,300 pages for indexing. I can't tell what the exact problem is: whether the server is not allowing the crawl, or the problem is with the XML sitemap generator. Please, please help me.
Including FAQ as Individual Blog Posts Without Duplicate Issues
My website's FAQ section has a lot of detailed answers, most of which I want to republish individually on my blog. Example: I may have 30 FAQs and want to upload 28 of them as individual blog posts, since they could bring good additional search traffic. Question: how do I deal with duplicate content issues? Do I include a canonical? The FAQs are all on a single URL, not separate URLs, which means each blog post would represent only a small % of the entire FAQ section, though each blog post would be a 100% copy of one FAQ.
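One common pattern here (offered as a suggestion; nothing in the thread confirms it, and the URL is hypothetical) is to canonicalize each FAQ-derived post back to the FAQ page:

```html
<!-- In the <head> of each blog post copied from the FAQ -->
<link rel="canonical" href="https://www.example.com/faq/" />
```

The trade-off: a canonical pointing at the FAQ page consolidates the duplicate signals, but it also tells Google not to rank the blog posts separately, which may defeat the goal of extra search traffic. Expanding each post well beyond the FAQ wording avoids the duplication instead.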
Joomla Duplicate Page content fix for mailto component?
Hi, I am currently working on my site and have the following duplicate page content issues:
http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2631849e33
http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2edd30f8c6
This happens 15 times. Any ideas on how to fix this, please? Thank you
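A common Joomla-side fix (offered here as an assumption, not something the thread confirms) is to keep the email-a-friend component out of crawling entirely via robots.txt, since those mailto URLs exist only to render the popup form:

```
# robots.txt: block Joomla's email-a-friend component URLs
User-agent: *
Disallow: /component/mailto/
```

An alternative is adding a noindex robots meta tag to the mailto component template, which also removes already-indexed copies over time rather than just blocking the crawl.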
Duplicate Content Issue
Why are URLs ending in .html or index.php considered a problem for search engines? I heard they can create duplicate content, but I have no idea why. Could someone explain why that is? Thank you
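The extensions themselves aren't the issue; duplication arises when the same page resolves at both /index.php and /. A common fix, sketched here under the assumption of Apache with mod_rewrite (not something stated in the question), is a 301 that collapses the index.php variant:

```
# .htaccess: 301 any .../index.php request to the bare directory URL,
# so only one version of each page can be indexed.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^?\ ]*/)?index\.php[?\ ]
RewriteRule ^(.*/)?index\.php$ /$1 [R=301,L]
```

The RewriteCond against THE_REQUEST matches only the URL the client actually asked for, so internal rewrites to index.php (as many CMSs do) don't loop.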