I have removed over 2,000 pages but Google still says I have 3,000+ pages indexed
-
Good Afternoon,
I run an office equipment website called top4office.co.uk.
My predecessor decided to make an exact copy of the content on our existing site, top4office.com, and place it on the top4office.co.uk domain, which included over 2,000 thin pages.
Since coming in, I have hired a copywriter who has rewritten all the important content, and I have removed over 2,000 thin pages.
I have set up 301 redirects, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which was done successfully.
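In case specifics help, the redirects are the standard one-to-one kind, roughly like this (a sketch assuming Apache; these paths are made up, not our real URLs):

    Redirect 301 /old-thin-page.html http://www.top4office.co.uk/new-rewritten-page/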
But although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still see over 3,000 indexed pages (originally I had 3,700).
Does anyone have any ideas why this is happening and, more importantly, how I can fix it?
Our rankings on this site are woeful in comparison to what they were in 2011. I have a deadline and was wondering how quickly, in your opinion, all these changes will impact my SERP rankings?
Look forward to your responses!
-
I agree with DrPete. You can't have the pages blocked in robots.txt; otherwise Google will not crawl them and "see" the 301s, and so it will never update the index.
Something else to consider: on the new pages, have them canonical to themselves. We had a site where Google was caching old URLs whose 301 redirects had been in place for two years. Google was finding the new pages, new titles, and new content, but was still referencing the old URLs. We saw this in the SERPs and also in GWT, which was reporting duplicate titles and descriptions for sets of pages that had been 301ed. Adding a self-referencing canonical helped get that cleaned up.
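For example, a self-referencing canonical in the <head> of each new page would look something like this (the URL here is hypothetical, not one of your real pages):

    <link rel="canonical" href="http://www.top4office.co.uk/office-chairs/" />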
Cheers.
-
This process can take a painfully long time, even when done right, but I do have a couple of concerns:
(1) Assuming I understand the situation, I think using robots.txt on top of 301 redirects is a bad idea. Robots.txt is good for preventing pages from being indexed, but bad for removal once something is already in the index: you're telling Google not to re-crawl these pages, and if Google doesn't re-crawl them, it will never process the 301s. So I'd drop the robots.txt blocking for now, honestly.
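To illustrate, a blocking rule like this (the path is hypothetical) tells Googlebot never to fetch those URLs, so it never sees the 301s:

    User-agent: *
    Disallow: /old-thin-pages/

Removing that Disallow, at least until the redirects have been processed, lets Google re-crawl the old URLs and drop them from the index.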
(2) What's your internationalization strategy? You could potentially try rel="alternate"/hreflang to specify US vs. UK English, target each domain in Webmaster Tools, and leave the duplicates alone. If you 301-redirect, you're not giving the UK site a chance to rank properly on Google.co.uk (if that's your objective).
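As a rough sketch (the paths are hypothetical), each page on both domains would carry a pair of tags like these:

    <link rel="alternate" hreflang="en-us" href="http://www.top4office.com/office-chairs/" />
    <link rel="alternate" hreflang="en-gb" href="http://www.top4office.co.uk/office-chairs/" />

That tells Google the two pages are regional variants of each other rather than duplicates, so each can rank in its own country's results.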
-
It sounds like you have done pretty much everything you could do to remove those pages from Google, and that Google has removed them.
There are two possibilities that I can think of. First, Google may be finding new pages, or at least new URLs. These may be old pages with some sort of parameter appended to them, causing Google to discover "new" pages even though you're not adding any.
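For example (hypothetical URLs), Google can index both of these as separate pages even though they show the same content:

    http://www.top4office.co.uk/printers/
    http://www.top4office.co.uk/printers/?sort=price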
The other possibility is that the site: search is not entirely accurate. Like most of the numbers Google gives us, it's an estimate of the actual figure. It's possible Google was understating the count in that original 3,700, and is now simply reporting more of the pages it already had in its index but wasn't showing before.
By the way, when I do a search for site:top4office.co.uk, I only get 2,600 results.
-
I no longer see the pages. There's no chance Google has seen any additional pages, as we spend every day checking newly indexed pages using the date filter and site:top4office.co.uk.
Any ideas?
-
Just a quick question: do you still see the URLs you "removed" in the index? Or is it possible that Google has found a different set of 3,000 URLs on your site?