404'd pages still in index
-
I recently launched a site and shortly after performed a URL rewrite (not the greatest idea, I know). The developer 404'd the old pages instead of 301 redirecting them permanently. This made a mess of the index. I have tried to use Google's removal tool to get these URLs out of the index. The pages were being removed, but now I am finding them in the index as bare URLs pointing to the 404'd pages (i.e. no title tag or meta description). Should I wait this out, or go back now and 301 redirect the old URLs (the ones that are 404'd) to the new URLs? I am sure this is the reason for my lack of rankings, as the rest of my site is pretty well optimized and I have some quality links.
-
Will do. Thanks for the help.
-
I think the latter - sort the robots.txt and 301.
But (if you can) leave a couple without a 301 and see what (if any) difference you get - I'd love to hear how it works out.
-
Is it better to remove the robots.txt entries that are specific to the old URLs so Google can see the 404s and drop those pages at its own pace, or to remove those bits of the robots.txt file and 301 the old URLs to the new ones? It seems those are my two options...? Obviously, I want to do whatever is best for the site's rankings and gives the fastest turnaround. Thanks for your help on this, by the way!
-
I'm not saying remove the whole robots.txt file - just the bits relating to the old URLs (if you have entries in robots.txt that affect the old URLs).
e.g. say your robots.txt blocks access to the old URLs -
then you should remove that line from the robots.txt, otherwise Google won't be able to crawl those pages to 'see' the 404 and realise that they're not there.
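To make that concrete, here's a minimal sketch of the kind of entry to delete - the path is hypothetical, borrowed from the /index/home/ example structure mentioned elsewhere in this thread:

```
# Before: this line blocks crawling of the old pages, so Google
# can never see that they now return 404.
User-agent: *
Disallow: /index/home/

# After: remove the Disallow line (or the whole block, if it only
# exists for the old URLs) so the 404s can actually be crawled.
User-agent: *
Disallow:
```

An empty Disallow value allows everything; if the block contains other rules you still need, delete only the line covering the old URLs.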
My guess is a few weeks before it all settles down, but that really is a finger-in-the-air guess. I went through a similar scenario - moving URLs and then moving them again shortly after the first move - and it took a month or two.
-
I am a little confused about removing the robots.txt entries, since blocking via robots.txt is a step in requesting removal from Google (per their removal tool requirements). My natural tendency is to 301 redirect the old URLs to the new ones. Will I need to remove the robots.txt entries prior to permanently redirecting the old URLs to the new ones? Roughly how long does it take Google to remove old URLs after a 301?
-
OK, got that - so that sounds like an external rewrite, which is fine. URL only, but no title or description - that sounds like what you get when you block crawling via robots.txt. If you've got that situation, I'd suggest removing the block so that Google can crawl the pages and find that they are 404s. Sounds like they'll fall out of the index eventually. Another thing you could try to hurry things along:
1. 301 the old URLs to the new ones.
2. Submit a sitemap containing the old URLs (so that they get crawled and the 301s are picked up).
3. Update your sitemap and resubmit with only the new URLs.
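To illustrate the temporary-sitemap idea, such a file might look like this (the URL is hypothetical, borrowed from the example structure given in this thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- An old URL that now 301s to its new home; listing it here
       encourages Google to recrawl it and pick up the redirect. -->
  <url>
    <loc>http://www.example.com/index/home/keyword</loc>
  </url>
</urlset>
```

Once the redirects have been processed, swap this out for a sitemap containing only the new URLs.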
-
When I say URL rewrite, I mean we restructured the URLs to be cleaner and more search friendly. For example, a URL that was www.example.com/index/home/keyword became www.example.com/keyword. Also, the old URLs (i.e. www.example.com/index/home/keyword) are being shown towards the end of a site:example.com search as just the bare URL - no title or meta description. Is this a sign that they are on the way out of the index? Any insight would be helpful.
-
A couple of things probably need clarifying. When you say URL rewrite, I'm assuming you mean an external rewrite (in effect, a redirect)? An internal rewrite, of itself, should make no difference at all to how any external visitors/engines see your URLs/pages. If the old pages had links or traffic, I would be inclined to 301 them to the new pages. If the old pages didn't have traffic/links, leave them - they'll fall out eventually. They're not in an XML sitemap by any chance, are they? (In which case, update the sitemap.) You often see a drop in rankings when restructuring a site and, in my experience, it can take a few weeks to recover. To give you an example, it took nearly two months for the non-www version of our site to disappear from the index after a similar move (and some messing about with redirects).
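To spell out the internal/external distinction, here is a minimal Apache mod_rewrite sketch - the rules are hypothetical, matching the /index/home/keyword example used in this thread, and the two rules are alternatives, not meant to run together:

```apache
RewriteEngine On

# Internal rewrite: visitors and search engines still see the old
# URL; Apache quietly serves the content from the new path.
# Of itself, this changes nothing for SEO.
# RewriteRule ^index/home/([^/]+)$ /$1 [L]

# External rewrite: the R=301 flag tells visitors and engines the
# page has moved permanently, so engines swap in the new URL.
RewriteRule ^index/home/([^/]+)$ /$1 [R=301,L]
```

The only difference is the R=301 flag, but it is the difference between an invisible server-side change and an actual redirect that search engines act on.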
Related Questions
-
Password Protected Page(s) Indexed
Hi, I am wondering if my website can get a penalty because some password-protected pages show up when I search on Google: site:www.example.com/sub-group/pass-word-protected-page. That shows my password-protected page was indexed, either before or after the password protection was added. I've seen people suggest noindexing the page. Is that the best method to take care of this? What if we are planning on pushing the page live later on? All of these pages have no title tag, meta description, image alt text, etc. Should I add them for each page? I am wondering what the best step is, especially if we are planning on pushing the page(s) live. Thanks for any help!
Intermediate & Advanced SEO | aua0
-
Will disallowing URLs in the robots.txt file stop those URLs being indexed by Google?
I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs that these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index. There is no benefit to the end user in these image pages being indexed. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help to remove them from the index? We don't want these pages to be found.
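An aside worth noting here (not from the question itself): a robots.txt Disallow alone generally won't remove pages that are already indexed - Google just stops crawling them. A commonly suggested alternative is to leave the pages crawlable and serve a noindex header instead; a sketch, assuming Apache 2.4 with mod_headers enabled:

```apache
# Serve "noindex" for the gallery URLs so Google drops them from
# the index. The pages must remain crawlable (no robots.txt block)
# until they have fallen out, or Googlebot never sees this header.
<If "%{REQUEST_URI} =~ m#^/catalog/product/gallery/#">
  Header set X-Robots-Tag "noindex"
</If>
```

Once the pages have dropped out, the robots.txt Disallow can be added to keep them from being recrawled.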
Intermediate & Advanced SEO | andyheath0
-
Why isn't my uneven link flow among index pages causing uneven search traffic?
I'm working with a site that has millions of pages. The link flow through the index pages is atrocious: for the letter A (for example), the index page A/1.html has a page authority of 25, and authority drops with each subsequent page until A/70.html (the last index page listing pages that start with A) has a page authority of just 1. However, the pages linked from the low-authority index pages (that is, the pages whose second letter is near the end of the alphabet) get just as much traffic as the pages linked from A/1.html (the pages whose second letter is A or B). The site gets a lot of traffic and has a lot of pages, so this is not just a statistical blip. The evidence is overwhelming that the pages linked from the low-authority index pages are getting just as much traffic as those linked from the high-authority index pages. Why is this? Should I "fix" the bad link flow problem if traffic patterns indicate there's no problem? Is this hurting me in some other way? Thanks
Intermediate & Advanced SEO | GilReich0
-
How to fix Invalid Product Page registering as Soft 404
Somehow, with our site architecture, Google is crawling URLs for products we no longer carry (there are no links to those pages, so I am still trying to figure out how Google is finding them). Those URLs are being redirected to our invalid-product page. That invalid-product page returns a 200 OK code, but according to Google it should be a 404, so we get a soft 404 error, and Google flags all of the URLs that redirect to that page as soft 404s as well. The first solution I can think of is to create a custom 404 page that looks just like our site, says we don't have the page/product they are looking for, has a search bar, sends a 404 code, etc. Is this the right way to go? It will probably take some time to implement, so is there a quick fix we could do first?
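A minimal sketch of the custom-404 idea described above, assuming an Apache server - the crucial detail is that the friendly "we don't have that product" page must be served with a genuine 404 status, not a 200:

```apache
# Point missing URLs at a branded error page. Using a local path
# (not a full http:// URL) preserves the 404 status code; a full
# URL would trigger a redirect and a 200, recreating the soft 404.
ErrorDocument 404 /product-not-found.html
```

The same principle applies on any server stack: stop redirecting discontinued-product URLs to a 200 page, and let them return 404 with the branded template.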
Intermediate & Advanced SEO | ntsupply0
-
Add noindex,nofollow prior to removing pages resulting in 404's
We're working with another site that, due to how their website has been programmed, is unfortunately a bit of a mess. Whenever an employee removes a page from the site through their homegrown 'content management system', rather than 301'ing it to another location on the site, the page is deleted and results in a 404. The interim question, until they implement a better way of managing the website, is: should they first add noindex,nofollow to the pages that are scheduled to be removed, and then, once the pages are removed, let them become 404s? Of note: it is possible that some of these pages will be used again in the future, and I would imagine they could submit them to Google through Webmaster Tools and add the pages back to their sitemap.
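For reference, the interim noindex,nofollow step described above would just mean adding this to the <head> of each page scheduled for removal (a generic sketch; how it gets injected depends on their CMS):

```html
<!-- Ask engines to drop this page from the index ahead of its
     removal. The page must stay crawlable for the tag to be seen. -->
<meta name="robots" content="noindex,nofollow">
```

If a page is later brought back, removing the tag and resubmitting the URL should let it be reindexed.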
Intermediate & Advanced SEO | Prospector-Plastics0
-
Why is my own page not indexed for that keyword?
Hi, I recently recreated the page www.zenucchi.it/ITA/poltrona-frau-brescia.html on the third-level domain poltronafraubrescia.zenucchi.it, putting it on the home page. The first page is still indexed for the keyword "poltrona frau brescia", but the new page is not indexed for that keyword and I don't know why (even though the page itself is indexed in Google). I should note that the new domain has the same authority, and that I put a 301 redirect in place to pass authority to the new page, which has many more incoming links than the previous one did. I hope you'll help me - thanks a lot.
Intermediate & Advanced SEO | guidoboem0
-
How long till pages drop out of the index
In your experience, how long does it normally take for 301-redirected pages to drop out of Google's index?
Intermediate & Advanced SEO | bjalc20110
-
Should I Allow Blog Tag Pages to be Indexed?
I have a WordPress blog with settings currently set so that Google does not index tag pages. Is this a best practice that avoids duplicate content, or am I hurting the site by taking eligible pages out of the index?
Intermediate & Advanced SEO | JSOC0