Robots.txt file issue.
-
Hi,
This is my third thread here, and I have created many like it on other webmaster communities. I know many pros are here, and I badly need help.
Robots.txt blocked 2k important URLs of my blogging site,
http://Muslim-academy.com/, especially in my blog area, which brings a good number of visitors daily. My organic traffic declined from 1k daily to 350.
I have removed the robots.txt file, resubmitted the existing sitemap, and used all the Fetch-to-index options and the 50-URL submission option in Bing Webmaster Tools.
What can I do now to get these blocked URLs back in the Google index?
1. Create a new sitemap and submit it again in Google Webmaster Tools and Bing Webmaster Tools?
2. Bookmark, link-build, or share the URLs? I did a lot of bookmarking for the blocked URLs.
I fetched the list of blocked URLs using Bing Webmaster Tools.
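For anyone else hitting this: before waiting on webmaster tools to re-crawl, you can verify locally that the offending rules are gone. This is a minimal sketch using Python's standard-library robots.txt parser; the URL and the directives shown are hypothetical examples, not the site's actual file.

```python
from urllib.robotparser import RobotFileParser

def can_googlebot_fetch(robots_txt: str, url: str) -> bool:
    """Parse a robots.txt body and check whether Googlebot may crawl `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

# A directive like this would block the whole /blog/ area:
broken = "User-agent: *\nDisallow: /blog/\n"
# An empty Disallow (or no robots.txt at all) blocks nothing:
fixed = "User-agent: *\nDisallow:\n"

url = "http://muslim-academy.com/blog/sample-post/"  # hypothetical URL
print(can_googlebot_fetch(broken, url))  # False: still blocked
print(can_googlebot_fetch(fixed, url))   # True: crawlable again
```

Running this against the live file (fetched with `parser.set_url(...)` and `parser.read()` instead of the inline string) is a quick sanity check that the fix actually shipped.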
-
Robert, some good signs of life. The new sitemap shows 5,080 pages submitted and 4,817 indexed.
The remaining pages are surely the blocked ones, right, Robert? There is also some improvement in impressions and clicks. Thanks a lot for staying with me this long in solving this issue.
-
Christopher,
Have you looked at indexing in GWMT to see whether the pages have been indexed, how many pages, etc.?
-
Got your point, but I resubmitted and its status is still pending.
I tested it and it was working, but since I submitted it two days ago its status has remained pending.
-
No, when you resubmit or submit a "new" sitemap, it just tells Google this is the sitemap now. There is no content issue with a sitemap.
Best,
Robert
-
Just one last question, Robert. Doesn't a duplicate sitemap create duplicate pages in search results?
Sorry, my question may look crazy to you, but while applying every possible fix I do not want to mess things up and make them even worse.
-
Given that the only issue was the robots.txt error, I would resubmit. That said, it would not hurt to generate a fresh sitemap and submit that, in case there is something you are missing.
Best
-
Robert, the question is: do I need to create a new sitemap, or just resubmit the existing one?
-
Hello Christopher
It appears you have done a good deal to remediate the situation already. I would resubmit a sitemap to Google also. Have you looked in WMT to see what is now indexed? I would look at the graph of indexed and robots.txt and see if you are moving the needle upward again.
This begs a second question: how did it happen? You stated, "Robots.txt blocked 2k important URL's of my blogging site," and that sounds like it just occurred out of the ether. I would want to know that I had found the reason, and make sure I have a way to keep it from happening going forward (just a suggestion).
Lastly, the Index Status report in WMT is a great way to learn how effective your fixes have been. I like knowing that type of data and storing it somewhere retrievable for the future.
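On the "how did it happen" point: on WordPress sites this is commonly caused by the "Discourage search engines from indexing this site" reading setting, or by an SEO/caching plugin rewriting robots.txt. A hypothetical example of the kind of blanket directive to look for, alongside a harmless baseline:

```
# Blocks everything -- the usual accidental culprit:
User-agent: *
Disallow: /

# Blocks nothing (a safe default once the fix is in):
User-agent: *
Disallow:
```

Worth checking the live file at /robots.txt after every plugin update so a regression is caught before traffic drops again.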
Best to you,
Robert