Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
301s being indexed
-
A client's website was moved to a new domain about six months ago. At the time of the move, 301 redirects were set up from the pages on the old domain to the corresponding pages on the new domain, and new pages were set up on the old domain for a different purpose. Now, almost six months later, when I run a query on the old domain like site:example.com, about 80% of the pages returned are URLs that 301 redirect to the new domain. I would have expected these to drop out of the index by now. I tried removing the URLs in Webmaster Tools, but the removal requests expire and the URLs come back. Is this something we should be concerned with?
-
Hi,
This is completely normal. URLs that 301 redirect often stay in the index for 6-12 months.
There isn't much you can do beyond verifying that your 301s are set up correctly. Move on.
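If you want to spot-check the redirects in bulk rather than one at a time, a short script can help. This is a minimal sketch using only the Python standard library; the domain names in it are placeholders, not anyone's actual sites.

```python
# Minimal sketch for spot-checking 301s: fetch each old URL without
# following redirects and confirm it answers with a single 301 whose
# Location header points at the new domain. Domains are placeholders.
import http.client
from urllib.parse import urlparse

def fetch_redirect(url):
    """Return (status, Location header) without following the redirect."""
    parts = urlparse(url)
    Conn = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = Conn(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

def is_clean_301(status, location, new_host):
    """True only for a permanent redirect landing on the new domain."""
    return status == 301 and bool(location) and urlparse(location).netloc == new_host
```

Run fetch_redirect over a list of old URLs and flag anything where is_clean_301 comes back False: a 302, a redirect chain, or a redirect looping back to the old domain are all common reasons stale URLs linger in the index.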
-
Hi there,
Have you run a crawl on your site to see if there are a lot of links pointing to the old URLs? If Google sees more links pointing to the old versions of the URLs than to the new versions, it's possible that it thinks the old pages aren't really gone for good.
- Kristina
-
Hi,
Thanks for your responses. There are no apparent issues with robots.txt or canonical tags. The 301 redirects are accessible to Googlebot (I checked in Webmaster Tools), and each page that a 301 redirects to on the new domain has a canonical tag pointing to the proper URL (itself).
-
Hi IrvCo_Interactive,
I'd recommend digging into the pages being 301 redirected to and making sure there are no conflicting directives, e.g. a rel="canonical" tag pointing to another page on the old domain. I've seen conflicting directives affect indexation before and wrote about it here: http://upstreamist.co/indexation-canonical-greater-than-301/
If there are no conflicting directives, it may be worth adding a canonical tag on top of the 301 redirect for at least a few pages, to see whether the canonical tag is more effective at removing the page from the index.
-Trung
-
If it's been six months, then yes, it might be a reason for concern, as users might still be sent to the old domain. Can you check whether you are somehow blocking the old domain with robots.txt? If that's the case, the bot can't reach the old pages to see the redirects, and if those pages are already in the index they will stay that way.
Alternatively, check the server logs to see whether Googlebot hit those pages in the last six months. Although I doubt it didn't, it's safe to check.
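Checking the robots.txt angle doesn't require guesswork; the Python standard library can evaluate the rules exactly as a parser would. A small sketch, with made-up rules and URLs standing in for the real old domain:

```python
# Sketch: test whether a robots.txt would block Googlebot from the old
# URLs. The rules and URLs below are made up for illustration; feed in
# the old domain's real robots.txt instead.
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt, url):
    """True if the given robots.txt rules let Googlebot fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)
```

If the old pages are disallowed, Googlebot never sees the 301s, so stale URLs can sit in the index indefinitely, which would match the behaviour described in this thread.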
Cheers!