Should you use the Google URL removal tool if older indexed pages are still being kept?
-
Hello,
A client did a redesign a few months ago, reducing 700 pages down to 60, mostly due to a Panda penalty and low interest in the products on those pages. Google is still indexing a good number of them (around 650) when we only have 70 on our sitemap.
The thing is, Google now indexes our site for around 115 URLs on average, when we only have 60 URLs that need indexing and only 70 on our sitemap. I would have thought these URLs would be crawled and not found, but it is taking a very long time.
Our rankings haven't recovered as much as we'd hoped, and we believe the older indexed pages are causing this.
Would you agree, and do you think removing those old URLs via the removal tool would be the best option? It would mean using the URL removal tool for 650 pages.
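For what it's worth, a quick way to see what those old URLs actually return now (and whether they even need removing) might look roughly like this. It's just a sketch; the file name is a placeholder rather than our real list:

```python
# Rough sketch: check what the old URLs actually return before reaching for the removal tool.
# Assumes the old URLs are in a plain text file, one per line (placeholder name).
import requests

OLD_URLS_FILE = "old_urls.txt"  # hypothetical file listing the ~650 old URLs

still_live = []   # URLs still returning 200 (these keep the old pages "alive" for Google)
gone = []         # URLs already returning 404/410 (these should drop out on their own)
other = []        # redirects, errors, timeouts, etc.

with open(OLD_URLS_FILE) as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # allow_redirects=False so a redirect is reported as a redirect, not as its target's status
        resp = requests.head(url, allow_redirects=False, timeout=10)
        if resp.status_code == 200:
            still_live.append(url)
        elif resp.status_code in (404, 410):
            gone.append(url)
        else:
            other.append((url, resp.status_code))
    except requests.RequestException as exc:
        other.append((url, str(exc)))

print(f"Still returning 200:      {len(still_live)}")
print(f"Returning 404/410:        {len(gone)}")
print(f"Other (redirects/errors): {len(other)}")
```

If most of them already come back as 404, they should drop out of the index on their own eventually; any that still return 200 would be the ones to look at first.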
Thank you in advance
-
Thanks!
We were hoping to recover a month ago, but if Google still has these old URLs cached, then it may very well think we still have them, so hopefully once the URLs are gone from Google's index we should see some improvement.
Thanks for clearing that up, hope this helps others in the same boat or somewhat close.
-
Just to clarify: did you check that there are no duplicate URL problems, such as www and non-www?
If you've checked and everything is correct, then you're on the right path, I hope: only 5 pages with results, which makes about 60 URLs in the SERPs.
Those that are only cached will be gone. Hope it helps.
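A quick way to check for that kind of duplication is to request both hostnames and see whether one cleanly 301-redirects to the other. A rough sketch, with a placeholder domain:

```python
# Rough sketch: check whether www and non-www resolve to one canonical host.
# The domain below is a placeholder; swap in the real one.
import requests

HOST = "example.com"  # placeholder domain

for url in (f"http://{HOST}/", f"http://www.{HOST}/"):
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    print(f"{url} -> {resp.status_code} {location}")

# Ideally one variant returns a 301 pointing at the other;
# if both return 200, Google can index the same pages under two hostnames.
```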
GR. -
Hello and Thanks!
How long ago did the redesign take place?
March 2016
Do all those pages that you want de-indexed show any content?
No. In fact, when I search Google for site:domain.com I get 800 URLs, but only pages 1-5 have results, despite Google showing 10 pages of results, so the URLs are still cached but without the content, I believe?
These have been 404'd, which is why I'm kind of at a loss as to what's going on.
-
Hi there!
How long ago did the redesign take place?
Do all those pages that you want de-indexed show any content? In my experience, it took about 4 weeks for pages to disappear from the index. But to get the pages taken down, they should return a 404.
Related Questions
-
Removing a site from Google's index with noindex meta tags
Hi there! I wanted to remove a duplicated site from the Google index. I've read that you can do this by removing the URL from Google Search Console and, although I can't find it in Google Search Console, Google keeps on showing the site in the SERPs. So I wanted to add a "noindex" meta tag to the code of the site; however, I've only found out how to do this for individual pages. Can you do the same for an entire site? How can I do it? Thank you for your help in advance! L
Technical SEO | | Chris_Wright1 -
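For the question above: one way to apply noindex at the site level rather than page by page is to send an X-Robots-Tag header on every response. The sketch below assumes, purely for illustration, that the site is served by a Python/Flask app, which may not match the actual setup in the question:

```python
# Rough illustration: apply "noindex" site-wide via an X-Robots-Tag response header.
# Assumes a Flask-served site purely for the sake of example.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello"

@app.after_request
def add_noindex_header(response):
    # Roughly equivalent to putting a robots "noindex, nofollow" meta tag on every page,
    # but applied in one place for the whole site.
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

if __name__ == "__main__":
    app.run()
```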
Single page website vs Google
Hi, I was wondering about this issue: there is a website for a guesthouse. It has all its information on one page (it is a valid page, with legitimate content). How does Google treat those pages? Would it treat it as a doorway page, or apply some other penalty? What about the bounce rate? It will be pretty high, as there is no option to go anywhere else. What is your opinion on single-page websites, SEO-wise? Is it a shot in the foot? Thanks!
Technical SEO | | LeszekNowakowski0 -
Vanity URLs are being indexed in Google
We are currently using vanity URLs to track offline marketing. The vanity URL is structured as www.clientdomain.com/publication; this URL is then 302 redirected to the actual URL on the website, not a custom landing page. The resulting redirected URL looks like: www.clientdomain.com/xyzpage?utm_source=print&utm_medium=print&utm_campaign=printcampaign. We have started to notice that some of the vanity URLs are being indexed in Google search. To prevent this from happening, should we be using a 301 redirect instead of a 302, and will the Google index ignore the UTM parameters in the URL that is being 301 redirected to? If not, any suggestions on how to handle it? Thanks,
Technical SEO | | seogirl221 -
What's going on with the Google index - JavaScript and Googlebot
Hi all, Weird issue with one of my websites. The website URL: http://www.athletictrainers.myindustrytracker.com/ Let's take 2 different article pages from this website: 1st: http://www.athletictrainers.myindustrytracker.com/en/article/71232/ As you can see, the page is indexed correctly on Google: http://webcache.googleusercontent.com/search?q=cache:dfbzhHkl5K4J:www.athletictrainers.myindustrytracker.com/en/article/71232/10-minute-core-and-cardio&hl=en&strip=1 (that's the "text only" version, indexed on May 19th) 2nd: http://www.athletictrainers.myindustrytracker.com/en/article/69811 As you can see, the page isn't indexed correctly on Google: http://webcache.googleusercontent.com/search?q=cache:KeU6-oViFkgJ:www.athletictrainers.myindustrytracker.com/en/article/69811&hl=en&strip=1 (that's the "text only" version, indexed on May 21st) They both have the same code, and about the dates: there are pages indexed before the 19th that are also problematic. Google can't read the content consistently; it seems to read it only when it wants to. Can you think of what the problem might be? I know that Google can read JS and crawl our pages correctly, but it happens only with a few pages and not all of them (as you can see above).
Technical SEO | | cobano0 -
How do I stop my webmail pages from being indexed on Google?
When I did a search in Google for site:mywebsite.com to get a list of indexed pages, surprisingly the following came up: "Webmail - Login". Although this is associated with the domain, it is a completely different server: the Rackspace email server browser interface. I am sure that there is nothing on the website that links or points to this.
Technical SEO | | UIPL
So why is Google indexing it, and how do I get it out of there? I tried in Webmaster Tools but I could not, as it seems to be a sub-domain. Any ideas? Thanks Naresh Sadasivan0
How do you know which version of your site is in Google's index?
This is going to sound like a strange question, but I am trying to understand which version of our site is in the index. You might think this is an obvious question, but here is why I am asking: 1. Today I searched for a specific keyword and found the listing. 2. I clicked on the right arrow next to the listing and checked the cache date. It says 6/28 and shows the site as of 6/28. 3. I expected to see that we had just been indexed, as we jumped several pages since yesterday, and I had just checked two days ago and we hadn't moved at all. It seems like Google may have picked up the changes we made on 7/2, but since it is showing 6/28, I am not sure. Since this is confusing, here is the chronology: 1. Made changes 6/20. 2. Site appeared to be indexed on 6/28. 3. Made changes on 7/2. 4. Checked the site on 7/2 and we were in position 60. Checked the site on 7/4 and we were in position 61. 5. Checked the site today (7/6) and see we are in position 8. The cache date shows as 6/28. I suspect that Google just indexed us yesterday and is reflecting the changes I made on 7/2. But the fact that it says it was cached on 6/28 seems to suggest otherwise. I want to be sure I know which version got us the good rankings - is there any way to be sure? Thanks!!
Technical SEO | | trophycentraltrophiesandawards0 -
Removing a site from Google's index
We have a site we'd like to have pulled from Google's index. Back in late June, we disallowed robot access to the site through the robots.txt file and added a robots meta tag with "no index,no follow" commands. The expectation was that Google would eventually crawl the site and remove it from the index in response to those tags. The problem is that Google hasn't come back to crawl the site since late May. Is there a way to speed up this process and communicate to Google that we want the entire site out of the index, or do we just have to wait until it's eventually crawled again?
Technical SEO | | issuebasedmedia0 -
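For the robots.txt question above, one thing worth checking is whether the robots.txt disallow is actually preventing crawlers from ever seeing the noindex tag. A rough diagnostic sketch, with a placeholder URL:

```python
# Rough diagnostic sketch: for a given URL, check whether robots.txt blocks crawling
# and whether a noindex signal (header or meta tag) is present on the page.
# The URL below is a placeholder.
import urllib.robotparser
from urllib.parse import urlparse

import requests

PAGE_URL = "https://example.com/some-page/"  # placeholder

parsed = urlparse(PAGE_URL)
robots_url = f"{parsed.scheme}://{parsed.netloc}/robots.txt"

rp = urllib.robotparser.RobotFileParser()
rp.set_url(robots_url)
rp.read()
blocked = not rp.can_fetch("Googlebot", PAGE_URL)

resp = requests.get(PAGE_URL, timeout=10)
header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
meta_noindex = "noindex" in resp.text.lower() and "robots" in resp.text.lower()  # crude text check

print(f"robots.txt blocks Googlebot: {blocked}")
print(f"X-Robots-Tag noindex:        {header_noindex}")
print(f"meta noindex (crude check):  {meta_noindex}")

# Note: if robots.txt blocks the URL, a crawler that obeys it may never fetch the page,
# so it will not see the noindex meta tag at all.
```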
Getting Google to index new pages
I have a site, called SiteB that has 200 pages of new, unique content. I made a table of contents (TOC) page on SiteB that points to about 50 pages of SiteB content. I would like to get SiteB's TOC page crawled and indexed by Google, as well as all the pages it points to. I submitted the TOC to Pingler 24 hours ago and from the logs I see the Googlebot visited the TOC page but it did not crawl any of the 50 pages that are linked to from the TOC. I do not have a robots.txt file on SiteB. There are no robot meta tags (nofollow, noindex). There are no 'rel=nofollow' attributes on the links. Why would Google crawl the TOC (when I Pinglered it) but not crawl any of the links on that page? One other fact, and I don't know if this matters, but SiteB lives on a subdomain and the URLs contain numbers, like this: http://subdomain.domain.com/category/34404 Yes, I know that the number part is suboptimal from an SEO point of view. I'm working on that, too. But first wanted to figure out why Google isn't crawling the TOC. The site is new and so hasn't been penalized by Google. Thanks for any ideas...
Technical SEO | | scanlin0