Robots.txt file issue.
-
Hi,
It's my third thread here, and I have created many like it on other webmaster communities. I know there are many pros here, and I badly need help.
Robots.txt blocked 2k important URLs of my blogging site,
http://Muslim-academy.com/, especially in my blog area, which brings a good number of visitors daily. My organic traffic declined from 1k daily to 350.
I have removed the robots.txt file, resubmitted the existing sitemap, used all the Fetch as Google submit-to-index options, and used the 50-URL submission option in Bing Webmaster Tools.
What can I do now to get these blocked URLs back in the Google index?
1. Create a NEW sitemap and submit it again in Google Webmaster Tools and Bing Webmaster Tools?
2. Bookmark, build links to, or share the URLs? I did a lot of bookmarking for the blocked URLs.
I fetched the list of blocked URLs using Bing Webmaster Tools.
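For anyone who wants to generate that blocked-URL list themselves, here is a minimal sketch using only Python's standard library. It assumes the sitemap lives at /sitemap.xml (an assumption, not confirmed in the thread) and checks each sitemap URL against the live robots.txt:

    from urllib.robotparser import RobotFileParser
    from urllib.request import urlopen
    from xml.etree import ElementTree

    SITE = "http://muslim-academy.com"

    # Load and parse the site's live robots.txt.
    parser = RobotFileParser()
    parser.set_url(SITE + "/robots.txt")
    parser.read()

    # Pull every <loc> entry out of the XML sitemap (path is an assumption).
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    tree = ElementTree.parse(urlopen(SITE + "/sitemap.xml"))
    urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]

    # Report every sitemap URL that Googlebot is not allowed to fetch.
    blocked = [u for u in urls if not parser.can_fetch("Googlebot", u)]
    print(len(blocked), "of", len(urls), "sitemap URLs are blocked:")
    for url in blocked:
        print(url)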
-
Robert, some good signs of life. The new sitemap shows 5,080 pages submitted and 4,817 indexed.
The remaining pages are surely the blocked ones, right, Robert? There is also some improvement in impressions and clicks. Thanks a lot for staying with me this long to solve this issue.
-
Christopher,
Have you looked at the indexing data in GWMT to see whether the pages have been indexed, how many, etc.?
-
Got your point, but I resubmitted, and its status is still pending.
I had tested it and it was working, but I submitted it two days ago and its status is still pending.
-
No, when you resubmit or submit a "new" sitemap, it just tells Google that this is the sitemap now. There is no duplicate-content issue with a sitemap.
Best,
Robert
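To illustrate Robert's point: a sitemap is only an XML list of URLs with optional metadata, so a resubmitted or duplicated sitemap hands Google the same list again rather than creating new pages. A minimal sketch (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://muslim-academy.com/example-post/</loc>
        <lastmod>2013-05-01</lastmod>
      </url>
    </urlset>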
-
Just one last question, Robert. Doesn't a duplicate sitemap create duplicate pages in the search results?
Sorry, my question may look crazy to you, but while applying every possible fix I want to be sure I do not mess up and make things even worse.
-
Given that the only issue was the robots.txt error, I would resubmit. That said, it would not hurt to generate a new sitemap and submit that, in case there is something you are missing.
Best
-
Robert, the question is: do I need to create a new sitemap, or resubmit the existing one?
-
Hello Christopher
It appears you have done a good deal to remediate the situation already. I would also resubmit a sitemap to Google. Have you looked in WMT to see what is now indexed? I would watch the graphs of indexed pages and of URLs blocked by robots.txt to see whether you are moving the needle upward again.
This raises a second question: "How did it happen?" You stated, "Robots.txt blocked 2k important URLs of my blogging site," and that sounds like it just occurred out of the ether. I would want to know that I had found the reason, and make sure I have a way to keep it from happening going forward (just a suggestion).
Lastly, the Index Status report in WMT should be a great way to learn how effective what you tried was. I like knowing that type of data and storing it somewhere retrievable for the future.
Best to you,
Robert
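On the "How did it happen?" point: one overly broad Disallow line is often all it takes to block a couple of thousand URLs. A hypothetical before-and-after for a WordPress blog (these paths are illustrative, not taken from the site's actual file):

    # Before (overly broad): blocks every URL whose path starts with /blog.
    User-agent: *
    Disallow: /blog

    # After (scoped): blocks only the admin area, leaving posts crawlable.
    User-agent: *
    Disallow: /wp-admin/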
Related Questions
-
Rogerbot directives in robots.txt
I feel like I spend a lot of time setting false positives in my reports to ignore. Can I prevent Rogerbot from crawling pages I don't care about with robots.txt directives? For example, I have some page types with meta noindex, and it reports these to me. Theoretically, I can block Rogerbot from them with a robots.txt directive and not have to deal with the false positives.
Reporting & Analytics | awilliams_kingston
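For what it's worth, Rogerbot does obey robots.txt, and its user-agent token is rogerbot, so a group along these lines should keep it off those pages (the path is a placeholder for the noindexed page types mentioned):

    # Applies only to Moz's crawler; other bots are unaffected.
    User-agent: rogerbot
    Disallow: /printable/
-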
Been stuck on SEO duplication issues (Shopify)
Hey there. We have been working on some of our webshops and recently started with Analytics/Moz, but we have basically hit a brick wall when it comes to www.krawattenwelt.de, since we have 5k high-priority issues (duplicate content) and 20k medium-priority issues. I have tried a large number of solutions for the duplicate content issues, but they didn't work, so we basically reverted everything for now, and I have the feeling I am really running out of options. Does anyone have an idea of how to handle this? The duplicate content issues look like this, for example: http://krawattenwelt.de/collections/budget-9-15 conflicts with http://krawattenwelt.de/collections/budget-9-15/modell_normal and with http://krawattenwelt.de/collections/budget-9-15/modell_normal?page=1
Reporting & Analytics | WebMaster205
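The usual remedy for collection variants like these is a rel=canonical link pointing each variant at the base collection URL, so the /modell_normal and ?page=1 versions consolidate rather than compete. A sketch of what each variant's head section would carry (using the example URL above):

    <!-- Served on /modell_normal and /modell_normal?page=1 alike. -->
    <link rel="canonical" href="http://krawattenwelt.de/collections/budget-9-15" />
-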
Potential Security Issues Installing Google Analytics GTM
Greetings, Moz Community! My SEO firm has requested that Google Tag Manager (GTM) be installed on my website to better track goals and events. My developer will be responsible for installing the GTM script on my website, and the SEO firm will configure GTM to track goals. Is anyone aware of GTM causing server errors/issues when installed? My developer claims that running GTM would create additional vulnerability to hacking. Is anyone aware of potential security risks from running Google Tag Manager? Once Google Tag Manager is installed, will it make the site more difficult for my developer to administer? Will there be increased risk of the code breaking? Overall, is GTM a big improvement over the existing version of Google Analytics? Is it worth the effort to install? My SEO firm is very much in favor of this upgrade, and I would like to move forward provided there is no major downside. I look forward to feedback. Thanks, Alan
Reporting & Analytics | Kingalan1
-
How to get a list of robots.txt-blocked URLs
This is my site: http://muslim-academy.com/. It's in WordPress. I just want to know: is there any way I can get the list of URLs blocked by robots.txt? Google Webmaster Tools is not showing them; it just gives the number of blocked URLs. Is there any plugin or software to extract the list of blocked URLs?
Reporting & Analytics | csfarnsworth
-
Having issues with pivot tables in GA
For the past three days, I've been trying to use pivot tables on a couple of reports; most importantly, I'm trying to use a pivot table on the All Pages report. I'd like to pull it for a year so I can see what our traffic sources are for top pages. I keep getting an error in Google Analytics that says, "Resource is Not Available. Please Try Again Later." See attached image. I've condensed my date range to a month, in case a year's data was too much, and I still get the error. I'm unable to use pivot tables in ANY report without getting this message. Is anyone else having issues?
Reporting & Analytics | Aggie
-
Woes in organic ranking... Post Penguin issue?
Hello all, first post here. Site: http://www.symbolphoto.com. I'd done a ton of work to get my SEO where it needs to be; however, looking at my traffic after April 15-22nd, it's gone completely downhill. I don't rank organically for the one term I'd like: 'b*ston wedding photographer'. What am I missing? I'm assuming Penguin may be a part of this, but I haven't participated in any link schemes... am I being penalized, and if so, is it evident why? Help is much appreciated! (Replace * with o.)
Reporting & Analytics | symbolphoto
-
Adding Something to htaccess File
When I did a Google search for site:kisswedding.com (my website), I noticed that Google is indexing all of the https versions of my site. First of all, I don't get it, because I don't have an SSL certificate. Then, last night, I did what my host (Bluehost) told me to do and added the below to my .htaccess file (rule added because Google is indexing the https version of the site - https://my.bluehost.com/cgi/help/758):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^kisswedding.com$ [OR]
    RewriteCond %{HTTP_HOST} ^kisswedding.com$
    RewriteCond %{SERVER_PORT} ^443$
    RewriteRule ^(.*)$ http://www.kisswedding.com [R=301,L]

Tonight, when I did a Google search for site:kisswedding.com, all of those https pages were being redirected to my home page - not the actual page they're supposed to redirect to. I went back to Bluehost, and they said a 301 redirect shouldn't work because I don't have an SSL certificate. BUT, I figure that since it's sort of working, I just need to add something to that .htaccess rule to make sure things redirect to the right page. Someone in the Google Webmaster Tools forums told me to do the below, but I don't really get it:

"to 301 redirect from /~kisswedd/ to the proper root folder you can put this in the root folder .htaccess file as well:

    Redirect 301 /~kisswedd/ http://www.kisswedding.com/"

Any help/advice would be HUGELY appreciated. I'm a bit at a loss.
Reporting & Analytics | annasus
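For what it's worth, the symptom described above (every https URL redirecting to the home page) is exactly what that rule produces: the captured path (.*) never appears in the redirect target. A variant that carries the path through would look like this (an untested sketch, not Bluehost's official advice):

    RewriteEngine On
    RewriteCond %{SERVER_PORT} ^443$
    # $1 re-inserts the captured path so each page redirects to itself over http.
    RewriteRule ^(.*)$ http://www.kisswedding.com/$1 [R=301,L]
-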
Meta Robots Tag - What's it really mean?
I used <meta name="robots" content="nofollow"> on a handful of pages recently and noticed that they're still popping up in the Google search index. I'd like to keep these from appearing, so I figured I needed a directive with stronger semantic meaning. From what I understand, <meta name="robots" content="noindex"> is what I'm looking for. Using this will keep Google from not only crawling the page, but indexing it as well. I decided to see what the official robotstxt.org website said about it, so I checked (link here): "the NOFOLLOW directive only applies to links on this page. It's entirely likely that a robot might find the same links on some other page without a NOFOLLOW (perhaps on some other site), and so still arrives at your undesired page." So, is their explanation saying that the page itself will be indexed, but the content/links on it won't be followed/indexed? Let me hear your thoughts, Mozzers.
Reporting & Analytics | mudbugmedia