Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
-
I'm working on a site that was hacked in March 2019; in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs steadily dropped out of the index until last week, when Search Console still showed around 8,000 of them as "Indexed, not submitted in sitemap" yet "Valid" in the coverage report. Many of these are still hack-related URLs, listed as indexed in March 2019, even though clicking on them leads to a 404. As of this Saturday, the number jumped to 18,000, but I have no way of finding out from the Search Console reports why the jump happened or which new URLs were added; the only sort mechanism is "last crawled", and they don't show up there.
How long can I expect it to take for these remaining URLs to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped.
Is there any way to see inside the new GSC view why/how the number of valid URLs in the index doubled over one weekend?
-
Google Search Console actually has a URL removal tool built in. Unfortunately it's not really scalable (submissions are mostly one URL at a time), and on top of that the effect of using the tool is only temporary (the URLs come back again).
In your case I reckon that changing the status code of the 'gone' URLs from 404 (which Google may read as "not found right now, but might come back") to 410 ("gone for good") might be a good idea. Google tends to digest that better, as it's a harder indexation directive and a very strong crawl directive ("go away, don't come back!").
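As a rough illustration (assuming, purely hypothetically, that the site runs on a Python/Flask stack and that the hacked URLs share a recognisable prefix; on an Apache setup the same effect is usually achieved with a RewriteRule and the [G] flag in .htaccess), serving 410 instead of 404 might look something like this:

```python
# Minimal sketch, not the actual site's code. "/spam-pages/" is a
# hypothetical placeholder for whatever pattern the hack-generated URLs share.
from flask import Flask, Response

app = Flask(__name__)

@app.route("/spam-pages/<path:slug>")
def hacked_url_gone(slug):
    # 410 Gone: a firmer signal than 404 that this URL is never coming back
    return Response("This page has been permanently removed.", status=410)

@app.route("/")
def home():
    return "Normal site content"

if __name__ == "__main__":
    app.run()
```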
You could also serve a noindex directive on those URLs. Obviously you can't edit the HTML of non-existent pages, but did you know the noindex directive can also be fired through the X-Robots-Tag HTTP header? So it's not impossible (there's a sketch of this just after the link below):
https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404
(Ctrl+F for "X-Robots-Tag HTTP header")
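Continuing the same hypothetical Flask sketch from above, sending the noindex directive in the X-Robots-Tag header on the 410 responses could look like this:

```python
# Sketch only: pair the 410 status with an X-Robots-Tag header, since there
# is no HTML page left on which to place a <meta name="robots"> tag.
from flask import Flask, Response

app = Flask(__name__)

@app.errorhandler(404)
def gone_and_noindexed(error):
    # Any URL that no longer resolves gets 410 Gone plus a noindex header
    resp = Response("This page has been permanently removed.", status=410)
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```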
Another option is this form, which lets Google know outdated content has been removed and isn't coming back:
https://www.google.com/webmasters/tools/removals
... but again, one URL at a time is going to be mega-slow. It does work pretty well, though (at least in my experience).
In any eventuality, I think you're looking at a week or two for Google to start noticing in a way you can see visually, and then maybe a month or two until it rights itself (caveat: it varies from site to site and URL to URL).