Stuck trying to deindex pages from Google
-
Hi There,
Our developers put a lot of spammy markup on one of our websites. We tried many ways to deindex the affected pages by fixing the markup and requesting recrawls. However, some of the URLs that carried the spammy markup were incorrect URLs that redirect to the right version (e.g. the same URL with or without a trailing slash).
So now all the regular URLs are updated and clean. However, the redirected URLs can't be reached in a crawl, so they weren't updated and the spam couldn't be removed. They still show up in the SERPs.
I tried deindexing those spammed pages by making them no-index in the robots.txt file. This seemed to be working for about a week, but now they show up again in the SERPs.
Can you help us get rid of these spammy URLs?
-
Ruchy,
Yes, it might have helped for a few weeks. But internal links from your site are not the only way crawlers discover your pages; remember that other sites may be linking to those pages.
B - Absolutely, adding noindex will help. There is no way to know for sure how long it will take; give it a few weeks. It could also help to remove all those pages manually in Google Search Console, as Logan said.
Hope it helps!
GR
-
Hi Gaston,
Thanks so much for taking the time to answer my question.
Here are two points. A - My mistake: in the robots.txt we disallowed it, and it was done right. Our devs did it for us and I double-checked it in the Search Console robots.txt tester. Also, this did work for us for the first few weeks.
B - There is nowhere the crawlers can find these pages to recrawl, as they are no longer linked from anywhere on my site. Will adding the noindex help? If so, how long might it take?
-
I second what Gaston said. This usage of robots.txt is one of the most common misconceptions in SEO, so don't feel bad. Google actually explicitly says not to use robots.txt for index prevention in their webmaster guidelines.
To add to Gaston's point, make sure you remove the robots.txt disallow when you add the meta noindex tag he provided. If you don't let them crawl the page, they won't see the tag.
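For illustration (the path here is hypothetical; yours will differ), if your robots.txt currently contains something like:
User-agent: *
Disallow: /spammy-page/
then that Disallow line needs to be removed (or the path allowed) so Googlebot can actually recrawl the page and see the noindex tag.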
You can also remove these URLs temporarily in Search Console by going to the Google Index menu and selecting "Remove URLs". That removes them from search results; then, when Google comes back to crawl the page again (as long as you're letting it), it will see your noindex tag and keep the page out.
-
Hello Ruchy,
If by "making no-index" in the robots you are meaning _disallowing _them, you are making ir wrong.
Robots.txt are just signs to the robots and only tell them to NOT CRAWL them, it doesnt prevent from indexing those pages. (it can happen the case that there is a link pointing to that page and the crawler just passes by it).The most used way to remove certaing indexed pages is by adding the robots noindex meta tag, it should look like this:
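<meta name="robots" content="noindex">
That tag goes in the <head> of each page you want dropped from the index.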
Also, some useful links:
Robots meta directives - Moz
Robots meta tag - Google developers
Robots tag generator
Hope it helps.
GR