Disappeared from Google within 2 hours of a Webmaster Tools error
-
Hey Guys
I'm trying not to panic, but... we had a problem with Google indexing some of our secure pages; visitors hitting those pages were getting browser security warnings, so I asked our web dev to have a look at it.
He made the changes below, and within 2 hours the site had dropped off the face of Google:
"In Webmaster Tools I asked it to remove any https://freestylextreme.com URLs. I cancelled that before it was processed."
"I then set up the robots.txt to respond with a disallow-all if the request was for an HTTPS URL. I've now removed robots.txt completely and resubmitted the main site from Webmaster Tools."
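For reference, the disallow-all file he was serving for HTTPS requests looked something like this (reconstructed from his description, not the exact file):

```text
User-agent: *
Disallow: /
```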
I've read a couple of blog posts, and all of them say to remain calm, test with Fetch as Google in Webmaster Tools (which all looks good), and just wait for Google to reindex.
Do you guys have any further advice?
Ben
-
Thanks for the responses, guys. The site was picked back up in around 4 hours and thankfully lost no rank. Orders crashed but are back to normal now! I'm going to investigate the two versions of the site; that is a bit strange.
Again, thanks for your help.
-
Hi Ben,
It's now 2 days after your original question, and it looks like you're back in the SERPs, at least from what I can tell. Hopefully you've made a full recovery.
It's difficult to understand exactly what damage was done by your dev in Webmaster Tools, but it's reasonable to assume that whatever it was caused the error.
One thing I did notice is that both the https and http version of your site resolve.
http://www.freestylextreme.com/ &
https://www.freestylextreme.com/
Ideally, one would redirect to the other, or at a minimum have a rel canonical tag in place so that only one version is crawled.
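If you go the redirect route and you're on Apache, a 301 in .htaccess is the usual approach. This is only a sketch (I don't know your server setup), assuming you settle on HTTPS as the canonical version; flip it if you standardize on HTTP instead:

```apache
# Sketch: 301 all HTTP requests to the HTTPS version (Apache mod_rewrite)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.freestylextreme.com/$1 [R=301,L]
```

If a redirect isn't practical, the rel canonical alternative is a single tag in the head of both versions, e.g. `<link rel="canonical" href="https://www.freestylextreme.com/" />`.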
I'd go ahead and put your robots.txt file back in place, and check it yourself with Google Webmaster Tools to make sure everything is okay.
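As a side note, you don't have to wait for Googlebot to find out what a robots.txt file actually blocks: Python's standard-library robot parser can evaluate a file body offline. A quick sketch (the file body is the disallow-all one described above; the URLs are just examples):

```python
from urllib.robotparser import RobotFileParser

# Evaluate a robots.txt body locally, without fetching anything.
DISALLOW_ALL = """
User-agent: *
Disallow: /
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(DISALLOW_ALL)

# Under a disallow-all file, Googlebot is blocked from every path.
print(parser.can_fetch("Googlebot", "https://www.freestylextreme.com/"))  # False
print(parser.can_fetch("Googlebot", "https://www.freestylextreme.com/products"))  # False
```

If the disallow-all file were still being served, both checks would come back blocked, which is exactly the behavior that knocked the HTTPS pages out.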
Best of luck!
-
Hi Ben
I understand the panic in such a situation. I truly do. I checked your website and noticed that you do not have a robots.txt file. You also do not have any kind of noindex tag or anything like that in your code, and you look like a strong, established website.
Do you have an XML sitemap in your Webmaster Tools console? I would suggest you build one and submit it if you don't already have one.
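If you do need to build one, a minimal sitemap.xml is just a list of url entries; something like this (the URLs and date here are placeholders, not your real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.freestylextreme.com/</loc>
    <lastmod>2013-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.freestylextreme.com/some-category/</loc>
  </url>
</urlset>
```

Once it's live, submit it in the Sitemaps section of Webmaster Tools.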
Other than that, I would also suggest having a robots.txt even if it's blank, rather than a 404 that redirects to your homepage. Give it 24-48 hours and, in my opinion, you should be back.
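For completeness, a blank-but-valid robots.txt that allows everything (and returns a proper 200) is just:

```text
User-agent: *
Disallow:
```

An empty Disallow line means nothing is blocked.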
I hope this helps.