Removing indexed pages
-
Hi all, this is my first post so be kind
- I have a one-page WordPress site with the Yoast plugin installed. Unfortunately, when I first submitted the site's XML sitemap to Google Search Console, I didn't check the Yoast settings, and the sitemap included some example files from a theme demo I was using. These got indexed, which is a pain, so now I am trying to remove them. Originally I set up a bunch of 301s, but that didn't remove them from the index (at least not after about a month), so I have now set up 410s. These also don't seem to be working, and I am wondering: since I re-submitted the sitemap with only the index page on it (as it is just a single-page site), could that have stopped Google from recrawling the original pages and actually seeing the 410s?
Thanks in advance for any suggestions. -
Thanks for all the responses!
At the moment I am serving the 410's using the .htaccess file, as I removed the actual pages a while ago. The pages don't show in most searches; however, two of them do show up in some instances under the sitelinks, which is the main pain. I manually asked for them to be removed using 'Remove URLs', but that only lasted a couple of months and they are now back.
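For anyone following along, here's roughly what those .htaccess rules might look like. This is just a sketch for Apache; the paths are placeholders, not the OP's actual URLs:

```apache
# Serve "410 Gone" for old theme-demo pages that got indexed.
# The paths below are placeholders - substitute the real URLs.
Redirect gone /demo-page-1/
Redirect gone /demo-page-2/

# Or, with mod_rewrite, one pattern covering a whole demo directory:
# RewriteEngine On
# RewriteRule ^theme-demo/ - [G,L]
```

`Redirect gone` uses core mod_alias, while the `[G]` flag in mod_rewrite sends the same 410 status for anything matching the pattern.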
So I guess the best way is to recreate the pages and insert a noindex?
Thanks again for everyone's time, it's much appreciated.
-
I agree with ViviCa1's methods, so go with that.
One thing I wanted to bring up, though: unless people are actually visiting those pages you don't want indexed, or they're doing some kind of brand damage, you don't really need to make this a priority.
Just because they're indexed doesn't mean they're showing up for any searches - and most likely they aren't - so people will realistically never see them. And if you only have a one-page site, you're not wasting much crawl budget on those.
I just bring this up since sometimes we (I'm guilty of it too) can get bogged down by small distractions in SEO that don't really help much, when we should be creating and producing new things!
"These also don't seem to be working, and I am wondering: since I re-submitted the sitemap with only the index page on it (as it is just a single-page site), could that have stopped Google from recrawling the original pages and actually seeing the 410's?"
There was a good related response from Google employee Susan Moskwa:
“The best way to stop Googlebot from crawling URLs that it has discovered in the past is to make those URLs (such as your old Sitemaps) 404. After seeing that a URL repeatedly 404s, we stop crawling it. And after we stop crawling a Sitemap, it should drop out of your "All Sitemaps" tab.”
It's a bit older, but it shows how Google discovers URLs through the sitemap. Take a look at the rest of that thread as well.
-
I'd suggest adding a noindex robots meta tag to the affected pages (see how to do this here: https://support.google.com/webmasters/answer/93710?hl=en) and, until Google recrawls them, using the Remove URLs tool (see how to use it here: https://support.google.com/webmasters/answer/1663419?hl=en).
If you use the noindex robots meta tag, don't also disallow the pages in your robots.txt, or Google won't even see the tag. Disallowing Google from crawling a page doesn't mean it won't be indexed (or that it will be removed from the index); it just means Google won't crawl the page.
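To add to this: since the OP has already deleted the actual pages, recreating each one just to hold a `<meta name="robots" content="noindex">` tag is a hassle. The same directive can instead be sent as an HTTP header from .htaccess. A sketch, assuming mod_headers is enabled; the filename pattern is a placeholder:

```apache
# Send the noindex directive as an HTTP header instead of a meta tag.
# Requires mod_headers; the FilesMatch pattern below is a placeholder -
# adjust it to the pages you want dropped from the index.
<FilesMatch "^(demo-page-1|demo-page-2)\.html$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Google honors X-Robots-Tag the same way it honors the meta tag, and the same caveat applies: the page must stay crawlable in robots.txt so Google can actually see the header.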
-
A couple of ideas spring to mind:
- Use the robots.txt file
- Demote the site link in Google search console (see https://support.google.com/webmasters/answer/47334)
Example of robots.txt file...
User-agent: *
Disallow: /the-link/you-dont/want-to-show.html
Disallow: /the-link/you-dont/want-to-show2.html
Don't include the domain, just the path to the page. There are plenty of tutorials out there; it's worth having a look at http://www.robotstxt.org