Schoolboy error!
-
We (well, I) blocked Google by accident, using a robots.txt that had been left over from our test site...
To correct this I:
1 - created, uploaded, and submitted a new sitemap
2 - fixed my robots.txt: http://www.camperbug.co.uk/robots.txt
The problem is, when I try Fetch as Googlebot in WMT it still says "Denied by robots.txt", despite the fact it's fixed!
Does anyone have any idea when Google will re-read my robots.txt? WMT says I have critical health issues and I don't want my keywords to be deindexed. Thanks!
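(For anyone who hits the same problem later: the difference between a test-site robots.txt that blocks everything and a permissive live-site one is a single character. This is only an illustrative sketch, not the exact contents of the file above, and the sitemap path is an assumption.)

# Staging/test site - blocks every crawler
User-agent: *
Disallow: /

# Corrected live site - allows every crawler (sitemap location assumed for illustration)
Sitemap: http://www.camperbug.co.uk/sitemap.xml
User-agent: *
Disallow: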
-
It should take a day or so. Everyone who has done enough sites has made a mistake like this (believe it or not, Google is eventually forgiving on most things). The tough part is waiting...
-
Just give it 24 hours; you should be good by the same time tomorrow. Google typically caches robots.txt for up to a day, which is why Fetch as Googlebot may still report the old rules. As you said, you have already submitted the XML sitemap and updated the robots.txt, and that should be it.
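If you want to sanity-check the live file yourself rather than waiting on WMT, here is a minimal sketch using Python's standard-library robot parser. The URL is the one from the question; the "Googlebot" user-agent string and homepage URL are just illustrative.

import urllib.robotparser

# Fetch and parse the live robots.txt
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.camperbug.co.uk/robots.txt")
rp.read()

# True if the current rules allow Googlebot to crawl the homepage
allowed = rp.can_fetch("Googlebot", "http://www.camperbug.co.uk/")
print("Googlebot allowed:", allowed)

Note this checks the file Google will eventually fetch; WMT can still show the older cached copy until it refreshes.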
-
Your site can be crawled. Like everyone said, just wait for big G.
I did notice a few 404 errors on your site. I have them in a CSV file. Here's the link:
http://dl.dropbox.com/u/58098579/client_error_(4xx)_links.csv
-
I only did it today, but the critical error in WMT is scary. Hopefully I don't get de-indexed as a result!
-
Sounds to me like you just need to wait until Google crawls your website again.
-
Has the new sitemap been crawled yet? It usually takes about a day for Google to process an updated sitemap. When did you resubmit yours?