Disallow: /search/ in robots.txt, but soft 404s are still showing in GWT and Google search?
-
Hi guys, I've already added the following directive to robots.txt to stop search engines from crawling the dynamic pages produced by my website's search feature: Disallow: /search/. But soft 404s are still showing in Google Webmaster Tools. Do I just need to wait? (It's been almost a week since I added this directive to my robots.txt.) Thanks, JC
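For reference, a minimal sketch of the robots.txt block in question (I'm assuming the rule is meant to apply to all crawlers, hence the User-agent: * line, and that the file sits at the site root):

User-agent: *
Disallow: /search/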
-
You could also look at using a meta robots noindex tag on the /search/ pages, rather than just blocking them in robots.txt, as this will remove the existing URLs from the index.
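A minimal sketch of what that looks like in the <head> of each /search/ page (note that Google has to be able to crawl a page to see the tag, so the robots.txt block would need to be lifted while those URLs drop out of the index):

<head>
  <meta name="robots" content="noindex, follow">
</head>

The "noindex, follow" value asks engines to drop the page from the index while still following its links; a plain "noindex" works too if that doesn't matter to you.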
-
Glad to help
-
Thanks a lot Dan!
-
That is a good recommendation, but ultimately search engines make the final decision on crawl frequency. Take a look at 'Crawl Stats' in GWT - this will give you an idea of how often your site is crawled.
-
Is the delay related to the crawl frequency of the URLs in my sitemap?
Thanks Dan, appreciate it.
-
You will probably need to wait a little longer - it depends how often your site usually gets crawled and indexed.
However, robots.txt does not always stop search engines from indexing your pages. It stops them from crawling a page on your site, but it does not stop them from indexing that URL. If they find links to it from external sites, the URL may still appear in the SERPs.
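If editing the page templates is awkward, the same noindex signal can also be sent as an HTTP header - a hypothetical Apache 2.4 sketch (assumes mod_headers is enabled and the search pages live under /search/; again, crawling of /search/ has to stay allowed so Googlebot can actually see the header):

<If "%{REQUEST_URI} =~ m#^/search/#">
    Header set X-Robots-Tag "noindex, follow"
</If>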
Related Questions
-
Enabling Podcast for Search / Structured Data
Hi, I'm trying to configure a podcast to show up in search using these guidelines and need help identifying which code to use per these instructions: https://developers.google.com/search/docs/data-types/podcast. Following that, we put the code inside the <head> of a designated podcast page, and it doesn't seem to be rendering properly when I test it. For this podcast home page, https://www.thepitchqueen.com/podcast-success-unfiltered/, we added a <link> element pointing to the RSS feed (href="http://successunfiltered.libsyn.com/rss") together with <title>Success Unfiltered Podcast</title>. Any ideas about what to do? Or, if this is correct, let me know.
Technical SEO | HiddenPeak
-
Some of my pages are not showing a cached version in Google - why?
I have the website http://www.vipcollisionlv.com/ and when I check the cache status with site:http:vipcollisionlv.com, some pages have no cache status, as you can see in the image. How can I resolve this issue? Please help me.
Technical SEO | 1akal
-
How to avoid the "Search instead for" suggestion in Google search results?
Hi, when I search for "Zotey" in Google, the following message is displayed: "Showing results for zotye. Search instead for zotey." Can anyone let me know how to get rid of this conflict ASAP? Regards, Sivakumar.
Technical SEO | segistics
-
Inurl: search shows results without keyword in URL
Hi there, while doing some research on the indexation status of a client I ran into something unexpected. I have my hypothesis on what might be happening, but would like a second opinion on this. The query 'site:example.org inurl:index.php' returns about 18,000 results. However, when I hover my mouse over these results, no index.php shows up in the URL. So Google seems to think these (then duplicate-content) URLs still exist, but a 301 has changed the actual target URL? A similar thing happens for inurl:page. In fact, all the 'index.php' and 'page' parameters were removed over a year back, so there shouldn't be any of those left in the index by now. The dates next to the search results are 2005, 2008, etc. (i.e. far before 2013), and these dates accurately reflect the times these forum topics were created. Long story short: are these ~30,000 'phantom URLs' in the index, out of a total of ~100,000 indexed pages, hurting the search rankings in some way? What do you suggest to get them out? Submitting a 100% coverage sitemap (just a few days back) doesn't seem to have any effect on these phantom results (yet).
Technical SEO | Theo-NL
-
My site was hacked and spammy outbound URLs were injected. The issue was fixed, but GWT is still reporting more of these links.
Excuse me for posting this here; I wasn't having much luck going through GWT support. We recently moved our eCommerce site to a new server, and in the process the site was hacked. Spammy URLs were injected, all of which pointed outwards to some spammy eCommerce retail stores. I removed ~4,000 of these links, but more continue to pile in. As you can see, there are now over 20,000 of these links. Note that our server support team does not see these links anywhere. I understand that Google doesn't generally view this as a problem, but is that true given my circumstances? I cannot imagine that 20,000 new, senseless 404s can be healthy for my website. If I can't get a good response here, would anyone know of a direct Google support email or number I can use for this issue?
Technical SEO | jampaper
-
Some competitors have a thumbnail in Google search results
I've noticed that a few of my top competitors have a small photo (thumbnail) next to their listing. I'm sure it's not a coincidence that they are ranked top for the search phrase too. Does this really help, and how can it be done? Many thanks, Iain.
Technical SEO | iainmoran
-
Robots.txt Showing in SERPs
Currently doing a technical audit for a website, and when I search "Site:website.com -www" the only result is website.com/robots.txt. I was wondering if anyone else has come across this before, or what this may mean from a technical audit standpoint. Thank you!
Technical SEO | vectormedia
-
"Site Suspended" in Google Adwords + Lost all rankings in Google => is this related?
Can anyone share some thoughts on this? Recently (mid-April) we revamped our website (same content, new layout, strong brand), but a few days later our Google rep contacted us to tell us that she got a "red flag" for one of our SEA campaigns (we broke the bridge page policy - not on purpose, to be clear); they were completely correct on this matter. We even got some extra time to correct it; the normal policy is only 10 days. But we were a little slow, so all our AdWords campaigns are now suspended and we get the message "Site suspended". We are working to have this fixed, and our Google rep has even granted some more time to fix it. Now, almost simultaneously, in the same time frame, all our new pages - which were already ranking well thanks to proper 301 rules - suddenly fell out of the Google SERPs, and nothing can be found anymore up till now. Our website has been live since 1996 with no issues, up till now. There seems to be a strong correlation between what happened in our SEA and what happened in our SEO. Can anyone share some info?
Technical SEO | TruvoDirectories