Should I deindex my pages?
-
I recently changed the URLs on a website to make them tidier and easier to follow. I put 301s in place to redirect all the previous page names to the new ones. However, I didn't read Moz's guide, which says I should leave the old sitemap online for a few weeks afterwards. As a result, Webmaster Tools is showing duplicate page titles (which means duplicate pages) for the old versions of the pages I have renamed. Since the old versions are no longer on the sitemap, Google can no longer access them to find the 301s I have put in place.
Is this a problem that will fix itself over time, or is there a way to speed up the process? I could use Webmaster Tools to remove these old URLs, but I'm not sure if this is recommended. Alternatively, I could try to recreate the old sitemap, but this would take a lot of time.
-
Glad to hear it! Yeah, patience isn't easy, that's for sure.
-
Good to hear it went well. Dr. Pete had some great advice. Glad to know it's working.
Thomas
-
Thanks for replying. As it happens, I checked Webmaster Tools this morning and found that all of the old pages had been deindexed on their own (with the exception of two that had not been 301'd; they've been sorted now), and there are now no duplicate title warnings. The site is also looking better in the SERPs.
You are right: I just needed to wait a few weeks and show some patience. It's quite easy to get impatient when justifying our cost to a client.
Thanks again.
-
Just checking in - has the situation improved in the month since you posted the question? I tend to agree with Thomas that it's usually just a waiting game, assuming the 301-redirects are working properly. It never hurts to use a header checker, just in case (it's amazing how often redirects get implemented poorly).
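If it helps, here's a minimal header-check sketch (Python, assuming the requests library; the old-to-new URL mapping is hypothetical). Any standalone header checker will do the same job:

```python
# Confirm each old URL answers with a single 301 hop to the expected
# new URL. Replace the hypothetical mapping with your real pairs.
import requests

redirects = {
    "http://example.com/old-page-name": "http://example.com/new-page-name",
}

for old_url, expected in redirects.items():
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location")
    ok = resp.status_code == 301 and location == expected
    print("OK " if ok else "FIX", old_url, "->", resp.status_code, location)
```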
You could re-create the old sitemap if the transition is stalled. I'd avoid actively removing the old URLs, as that could prevent the link equity from the old URLs passing to the new ones. The only reason to remove them would be if you suspect duplicate content problems are actively harming you.
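If you do go the sitemap route, re-creating it doesn't have to be a big job. Here's a rough sketch, assuming you can export the old URLs (from server logs, the CMS, or analytics) into a plain text file; the file names are made up for illustration:

```python
# Rebuild a sitemap of the OLD URLs so crawlers revisit them and
# discover the 301s. Assumes old_urls.txt holds one URL per line.
from xml.sax.saxutils import escape

with open("old_urls.txt") as f:
    old_urls = [line.strip() for line in f if line.strip()]

with open("old-sitemap.xml", "w") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in old_urls:
        out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
    out.write("</urlset>\n")
```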
The devil is in the details on these situations, and it depends a lot on the size of the site, etc.
-
I do not know how big your old site was, but if you have only just made the change you may see some overlap while the cached copies of the old pages die away.
You can always use the Wayback Machine at archive.org.
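The Wayback Machine also has a CDX API you can query to pull the list of URLs it captured. A quick sketch, assuming the requests library; the domain is a placeholder:

```python
# List the URLs the Wayback Machine has captured for a domain via its
# CDX API. Swap the placeholder for your real domain.
import requests

resp = requests.get(
    "http://web.archive.org/cdx/search/cdx",
    params={
        "url": "example.com/*",   # placeholder domain
        "output": "json",
        "fl": "original",
        "collapse": "urlkey",     # one row per unique URL
    },
    timeout=30,
)
for row in resp.json()[1:]:       # first row is the column header
    print(row[0])
```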
After that, follow the Moz guide and use a tool like Screaming Frog SEO Spider: the free version crawls up to 500 pages, and while the paid version is required for redirect mapping, it makes the job so easy that it's well worth it.
That way you can see your old site the way it was, and what the old URLs used to be. However, if you have correctly 301-redirected the old links to their equivalent pages on the new site, this should not be an issue; you just have to allow Google time to index the new site and drop the old URLs from its index. To get rid of the old URLs faster, I would use the Change of Address tool in Google Webmaster Tools to tell Google that you have moved your website, which gets the new pages indexed more quickly. However, that tool is meant for moves to a new domain, so it may be complicated to use it here.
It does not sound like a crisis to me. However, only you know how you implemented the 301 redirects.
Sincerely,
Thomas
Related Questions
-
Pages not indexable?
Hello, I've been trying to find out why Google Search Console finds these pages non-indexable: https://www.visitflorida.com/en-us/eat-drink.html and https://www.visitflorida.com/en-us/florida-beaches/beach-finder.html. Moz and SEMrush both crawl the pages and show no errors, but GSC comes back with "blocked by robots.txt", and I've confirmed it is not. Anyone have any thoughts?
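One quick way to double-check is Python's standard-library robots.txt parser; it isn't identical to Google's parser, but it catches obvious blocking rules:

```python
# Ask a robots.txt parser whether Googlebot may fetch the URLs GSC is
# flagging. A useful second opinion, though not Google's exact parser.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.visitflorida.com/robots.txt")
rp.read()

for url in (
    "https://www.visitflorida.com/en-us/eat-drink.html",
    "https://www.visitflorida.com/en-us/florida-beaches/beach-finder.html",
):
    print(rp.can_fetch("Googlebot", url), url)
```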
Technical SEO | KenSchaefer0
-
Homepage is deindexed in Google
We recently noticed that our primary page was de-indexed in Google. Looking in Google Search Console, there are no manual actions taken. We did add a few new banners to the site, but I have no idea why this would have negatively affected the site. I did add a new page at https://enleaf.com/company/testimonials/ that had some duplicate testimonials that were also on the home page, but I have since removed that. Not sure where to go from here.
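A first diagnostic worth running is a check for the two common accidental deindexing culprits, a noindex meta tag and an X-Robots-Tag header. A sketch, assuming the requests library:

```python
# Check the homepage for an X-Robots-Tag response header and a meta
# robots tag. The URL is taken from the post above.
import re
import requests

resp = requests.get("https://enleaf.com/", timeout=10)
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))

meta_tags = re.findall(
    r'<meta[^>]+name=["\']robots["\'][^>]*>', resp.text, flags=re.I
)
print("Meta robots tags:", meta_tags or "none found")
```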
Technical SEO | AChronister0
-
Why Are Some Pages On A New Domain Not Being Indexed?
Background: A company I am working with recently consolidated content from several existing domains into one new domain. Each of the old domains focused on a vertical, and each had a number of product pages and blog posts; these are now in directories on the new domain. For example, what was www.verticaldomainone.com/products/productname is now www.newdomain.com/verticalone/products/productname, and the blog posts have moved from www.verticaldomaintwo.com/blog/blogpost to www.newdomain.com/verticaltwo/blog/blogpost. Many of those pages used to rank in the SERPs, but they now do not.
Investigation so far: Looking at Search Console's crawl stats, most of the product pages and blog posts do not appear to be indexed. This is confirmed by using the site: search modifier, which only returns a couple of products and a couple of blog posts in each vertical, and those pages are not the ones with backlinks pointing directly at them.
I've investigated the obvious points without success so far. There are a couple of issues with 301s that I am working with them to rectify, but I have checked all pages on the old site and most redirects are in place and working. There is currently no HTML or XML sitemap for the new site (this will be put in place soon), but I don't think this is the issue, since a few products are being indexed and appearing in SERPs. Search Console is returning no crawl errors, manual penalties, or anything else adverse. Every product page is linked to from the /course page for the relevant vertical through a followed link. None of the pages have a noindex tag on them, and the robots.txt allows all crawlers to access all pages.
One thing to note is that the site is built using React.js, so all content is within app.js. However, this does not appear to affect pages higher up the navigation trees, like the /vertical/products pages or the home page. So the question is: why might product and blog pages not be indexed on the new domain when they were previously, and what can I do about it?
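Given the React.js point, one quick diagnostic is to check whether the product copy appears in the raw, unrendered HTML at all, since that is what a non-rendering crawler sees and Google's JavaScript rendering can be deferred. A sketch, assuming the requests library; the URL and phrase are placeholders:

```python
# Check whether distinctive page copy is present in the raw HTML of a
# client-rendered React page. Replace the placeholders to test.
import requests

url = "http://www.newdomain.com/verticalone/products/some-product"
phrase = "some distinctive product copy"

html = requests.get(url, timeout=10).text
print("Phrase found in raw HTML:", phrase.lower() in html.lower())
```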
Technical SEO | BenjaminMorel0
-
New site: More pages for usability, or fewer more detailed pages for greater domain authority flow?
Ladies and gents! We're building a new site. We have a list of 28 professions, and we're wondering whether or not to include them all on one long and detailed page, or to keep them on their own separate pages. Thinking about the flow of domain authority - I could see 28 pages diluting it quite heavily - but at the same time, I think having the separate pages would be better for the user. What do you think?
Technical SEO | Muhammad-Isap1
-
Should I change my targeted page?
Currently I have a site where the targeted keywords were on the home page, with links built to the home page. It has been widely recognised, though, that Google increasingly looks for specific page content that holds greater relevance to search queries. As such, I switched the targeting to other, newly created pages, changing meta tags and creating more relevant content for the respective keywords. I thought this would improve rankings; however, upon doing this there was a sharp fall in keyword rankings. Is there anything that I could have done wrong, or can do better, so that the keywords move back up the rankings?
Technical SEO | Gavo0
-
How to determine which pages are not indexed
Is there a way to determine which pages of a website are not being indexed by the search engines? I know Google Webmaster Tools has a sitemap area where it tells you how many URLs have been submitted and how many of those are indexed. However, it doesn't show which URLs aren't being indexed.
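Short of checking each URL by hand, one workable approach is to walk the sitemap and flag URLs that can't be indexed at all (non-200 status or a noindex signal); that won't show what Google actually indexed, but it narrows the list to spot-check. A rough sketch, assuming the requests library and a placeholder sitemap URL:

```python
# Fetch every URL in a sitemap and flag the ones that are unindexable
# (bad status code, noindex header, or noindex meta tag).
import re
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
xml_text = requests.get("https://example.com/sitemap.xml", timeout=30).text
urls = [loc.text for loc in ET.fromstring(xml_text).findall(".//sm:loc", NS)]

for url in urls:
    resp = requests.get(url, timeout=10)
    noindex = "noindex" in (resp.headers.get("X-Robots-Tag") or "") or bool(
        re.search(r'name=["\']robots["\'][^>]*noindex', resp.text, re.I)
    )
    if resp.status_code != 200 or noindex:
        print(resp.status_code, "noindex" if noindex else "", url)
```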
Technical SEO | priceseo1
-
Duplicate Page Titles
I had an issue where I was getting duplicate page titles for my index file. The following URLs were being viewed as duplicates: www.calusacrossinganimalhospital.com, www.calusacrossinganimalhospital.com/index.html, and www.calusacrossinganimalhospital.com/. I tried many solutions and came across rel="canonical", so I placed a canonical tag pointing at the preferred version of the URL in my index.html. I did a crawl, and it seemed to correct the duplicate content. Now I have a new message, and I just want to verify whether it is bad for search engines or normal. Please view the attached image.
Technical SEO | pixel830
-
Getting More Pages Indexed
We have a large e-commerce site (Magento-based) and have submitted sitemap files for several million pages within Webmaster Tools. The number of indexed pages fluctuates, but currently fewer than 300,000 pages are indexed out of 4 million submitted. How can we get the number of indexed pages higher? Changing the crawl-rate settings and resubmitting sitemaps doesn't seem to affect the number of pages indexed. Am I correct in assuming that most individual product pages just don't carry enough link juice to be considered important enough by Google to be indexed yet? Let me know if there are any suggestions or tips for getting more pages indexed.
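One practical tip at this scale: splitting the sitemaps into smaller, per-category files makes Webmaster Tools report indexed counts per file, so you can see which sections are being ignored. A rough sketch of the splitting step, assuming Python and the 50,000-URL-per-file limit from sitemaps.org; the file and domain names are placeholders:

```python
# Split a large URL list into 50,000-URL sitemap files plus a sitemap
# index. All file and domain names below are placeholders.
from xml.sax.saxutils import escape

CHUNK = 50_000
BASE = "https://example.com"

with open("all_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

names = []
for i in range(0, len(urls), CHUNK):
    name = f"sitemap-{i // CHUNK + 1}.xml"
    names.append(name)
    with open(name, "w") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls[i:i + CHUNK]:
            out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        out.write("</urlset>\n")

with open("sitemap-index.xml", "w") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for name in names:
        out.write(f"  <sitemap><loc>{BASE}/{name}</loc></sitemap>\n")
    out.write("</sitemapindex>\n")
```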
Technical SEO | Mattchstick0