How to recover after blocking all the search engine spiders?
-
I have the following problem: one of my clients (a Danish home improvement company) decided to block all international traffic (leaving only Scandinavian traffic), because they were getting a lot of spammers using their mail form to send e-mails.
As you can guess, this also blocked Google, since even Google Denmark crawls from servers located in the US. This led to a drop in their rankings. So my question is: what should I do now - wait, or contact Google?
Any help will be appreciated, because to be honest I had never seen such a thing in action until now.
Best Regards
-
I really appreciate your help. Thanks for the fast reply. I really hope this is the last time they try such techniques, because it is kind of frustrating.
-
One of our clients did this once. We restored the robots.txt to its original version, allowing robots to crawl normally, and pointed a few fresh links at the site. The site was back in the index within a week. If your site has thousands or millions of pages, it may take a bit longer. We found that discovery through links leads to better and quicker indexation than simply submitting to Google. Get a Google Webmaster Tools account and ensure that Google has a clear picture of your site.
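Once the restored robots.txt is live, it is worth sanity-checking that Googlebot is actually allowed again before waiting on recrawling. A minimal sketch using only Python's standard library; the rules and the example.com URLs are placeholders, not the client's actual file:

```python
# Verify that a restored robots.txt permits Googlebot to fetch key URLs.
# The rules and URLs below are illustrative placeholders.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in ("https://example.com/", "https://example.com/products/"):
    status = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", status)
```

In a real check you would point `parser.set_url()` at the live robots.txt and call `parser.read()` instead of parsing a string.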
Related Questions
-
How do I authenticate a script with Search Console API to pull data
In regard to this article, https://moz.com/blog/how-to-get-search-console-data-api-python, I've gotten all the way to the part where I need to authenticate the script. I grant access to GSC and the localhost code comes up. The article says to grab the portion between = and #, but that doesn't seem to be the case anymore. This is what comes up in the browser:
http://localhost/?code=4/igAqIfNQFWkpKyK6c0im0Eop9soZiztnftEcorzcr3vOnad6iyhdo3DnDT1-3YFtvoG3BgHko4n1adndpLqjXEE&scope=https://www.googleapis.com/auth/webmasters.readonly
When I put portions of it in, it always comes back with an error. Help!
Technical SEO | Cnvrt
-
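In the newer redirect format shown in the question, the authorization code is simply the value of the `code` query parameter (and `&scope=...` is a separate parameter, which is why the old "between = and #" advice breaks). A sketch of extracting it with the standard library, using the exact URL from the question:

```python
# Extract the OAuth authorization code from the localhost redirect URL.
# The URL is the one quoted in the question above.
from urllib.parse import urlparse, parse_qs

redirect_url = (
    "http://localhost/?code=4/igAqIfNQFWkpKyK6c0im0Eop9soZiztnftEcorzcr3vOnad6iyhdo3DnDT1-3YFtvoG3BgHko4n1adndpLqjXEE"
    "&scope=https://www.googleapis.com/auth/webmasters.readonly"
)

params = parse_qs(urlparse(redirect_url).query)
auth_code = params["code"][0]   # this is the string to paste back into the script
print(auth_code)
```

Pasting that `auth_code` value (not the whole URL) into the script's prompt is what the article's "between = and #" step was doing for the older URL shape.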
Blocking subdomains with Robots.txt file
We noticed that Google is indexing our pre-production site, ibweb.prod.interstatebatteries.com, in addition to indexing our main site, interstatebatteries.com. Can you help shed some light on the proper way to noindex our pre-production site without impacting our live site?
Technical SEO | paulwatley
-
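One point worth noting: robots.txt on the pre-prod host only stops crawling; URLs can stay indexed from external links. Serving a noindex header from the pre-production vhost alone keeps it out of the index and cannot leak to the live site, since the directive lives only in that host's config. A hedged sketch, assuming the pre-prod server runs Apache (the vhost scoping is up to your own configuration):

```apache
# Pre-production vhost ONLY - do not add this to the live site's config.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

HTTP authentication on the pre-prod host achieves the same end even more safely, if that is an option.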
"Url blocked by robots.txt." on my Video Sitemap
I'm getting a "URL blocked by robots.txt" warning on my video sitemap, but only for YouTube videos. Has anyone else encountered this issue, and if so, how did you fix it? Thanks, J
Technical SEO | Critical_Mass
-
Why has my search traffic suddenly tanked?
On 6 June, Google search traffic to my WordPress travel blog http://www.travelnasia.com tanked completely. There are no warnings or indicators in Webmaster Tools that suggest why this happened. Traffic from search has remained at zero since 6 June and shows no sign of recovering.

Two things happened on or around 6 June: (1) I dropped my premium theme, which was proving to be not mobile friendly, and replaced it with the ColorMag theme, which is responsive. (2) I relocated from my previous hosting service, which was showing long server lag times, to a faster host. Both of these should have improved my search performance, not tanked it.

There were some problems with the relocation to the new web host, which resulted in a lot of "out of memory" errors on the website for 3-4 days. The allowed memory was simply not enough for the complexity of the site and the volume of traffic. After a few days of trying to resolve these problems, I moved the site to another web host that allows more PHP memory, and the site now appears reliably accessible on both desktop and mobile. But my search traffic has not recovered.

I am wondering if, in all of this, I've done something that Google considers a cardinal sin and I can't see it. The clues I'm seeing include:

- Moz Pro was unable to crawl my site last Friday. It seems like every URL it tried to crawl was of the form http://www.travelnasia.com/wp-login.php?action=jetpack-sso&redirect_to=http://www.travelnasia.com/blog/bangkok-skytrain-bts-mrt-lines which resulted in a 500 status error. I don't know why this happened, but I have disabled the Jetpack login function completely, just in case it's the problem.
- GWT tells me that some of my resource files are not accessible by Googlebot because my robots.txt file denies access to /wp-content/plugins/. I have removed this restriction after reading the latest advice from Yoast, but I still can't get GWT to fetch and render my posts without some resource errors.
- On 6 June, I see in Structured Data in GWT that "items" went from 319 to 1478 and "items with errors" went from 5 to 214. There seems to be a problem with both the hAtom and hCard microformats, but when I look at the source code they seem to be OK. What I can see in GWT is that each hCard has a node called "n [n]" which is empty, and Google is generating a warning about this. I see that this is because the author vcard URL class now says "url fn n", but I don't see why it says this or how to fix it. I also don't see how this would cause my search traffic to tank completely.

I wonder if anyone can see something I'm missing on the site. Why would Google completely cut off search traffic to my site all of a sudden without notifying any kind of penalty? Note that I have NOT changed the content of the site in any significant way, and even if I had, it's unlikely to result in a complete loss of traffic without some kind of warning.
Technical SEO | Gavin.Atkinson
-
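On the robots.txt point raised above: the Yoast-style fix of letting Googlebot fetch plugin CSS/JS amounts to removing (or overriding) the Disallow on /wp-content/plugins/. A minimal sketch of what the "after" state might look like; the /wp-admin/ line is a common WordPress default used here for illustration, not taken from the question:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/plugins/
```

After changing the file, GWT's "Fetch and Render" should stop reporting blocked plugin resources once the new robots.txt has been recrawled.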
Similar Websites, Same C Block: Can I Get a Penalty?
One of my websites has been heavily hit by Google's entire zoo, so I decided to phase it out while building a new one.

Old website: www.thewebhostinghero.com
New website: www.webhostinghero.com

Now, the thing is that both websites are obviously similar, since I kept the branding. They also both have content about the same topics. No content has been copied or spun, though; everything is original on both websites. Only 3 parts of both websites were too similar in terms of functionality, so I "noindexed" them on the old website.

It seems that Google doesn't want you to have multiple websites for the same business just for the sake of occupying more space in the search results, and that this can especially be detected by the websites' C block. I am not sure if this is myth or fact, though. So do you think I'm in a problematic situation with this scenario? It's getting ridiculous, everything you have to watch for when building a website; I'm afraid to touch my keyboard for fear my websites will get penalized! Sorry for my English, btw.

Technical SEO | sbrault74
-
Tool to search relative vs absolute internal links
I'm preparing for a site migration from a .co.uk to a .com, and I want to ensure all internal links are updated to point to the new primary domain. What tool can I use to audit internal links? Some are relative and others are absolute, and I need to update them all to relative.
Technical SEO | Lindsay_D
-
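For a one-off audit like this, a short script can separate relative from absolute links: a link is absolute if its URL has a network location (a domain), otherwise it is relative. A sketch using only Python's standard library; the sample HTML and the example.co.uk domain are placeholders for the migrating site:

```python
# Classify <a href> links in an HTML document as absolute or relative.
# The sample HTML below is a placeholder for a real crawled page.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.absolute, self.relative = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # A URL with a netloc (domain) is absolute; anything else is relative.
        if urlparse(href).netloc:
            self.absolute.append(href)
        else:
            self.relative.append(href)

html = '<a href="http://www.example.co.uk/about/">About</a> <a href="/contact/">Contact</a>'
auditor = LinkAuditor()
auditor.feed(html)
print("absolute:", auditor.absolute)
print("relative:", auditor.relative)
```

Run over every crawled page, the `absolute` list is the set of hrefs that need rewriting before the .com migration.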
Search/Search Results Page & Duplicate Content
If you have a page whose only purpose is to allow searches, and the search results can be generated by any keyword entered, should all those search-result URLs be noindexed or given a rel=canonical? Thanks.
Technical SEO | cakelady
-
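For what it's worth, a common pattern for internal search-result pages is noindex rather than rel=canonical, since results for different keywords are not true duplicates of any single canonical page. A hedged sketch of the tag such result pages would carry in their head:

```html
<meta name="robots" content="noindex, follow">
```

The `follow` keeps link equity flowing through the result listings even though the pages themselves stay out of the index.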
Different pages first results on same keyword search
Hi,

Sometimes Google does not show the page you intended for a certain keyword. Logically, you would say that the intended page is not relevant or strong enough. But in my case, several pages ranked fine for a long period of time, and all of a sudden another, less important page gets the highest result for the keyword search. (We are in the camping business.)

For instance: one of our campsites, called Tenuta Primero, used to rank at position 9 in Google (search: 'camping tenuta primero') with the page below for a long time. This was the page we intended to rank with:
http://www.suncamp.nl/nl/nl/campings/italie/friuli-venezia-giulia/camping-tenuta-primero/uc19-l1-n797-c13-r115-cp104959/

Now, all of a sudden, the position for the search 'camping tenuta primero' is 33, with the review page below:
http://www.suncamp.nl/nl/nl/campings/italie/friuli-venezia-giulia/camping-tenuta-primero/beoordelingen/uc19-l1-n797-c13-r115-cp104959-t22598/

What could have caused this? The pages are in Dutch, but the main keywords are 'camping tenuta primero'. Thank you very much in advance!

Kind regards,
Dennis Overbeek
Dennis@acsi.eu | www.suncamp.nl
Technical SEO | SEO_ACSI