Robots.txt file issue.
-
Hi,
It's my third thread here, and I have created many like it on other webmaster communities. I know many pros are here, so I badly need help.
Robots.txt blocked 2k important URLs of my blogging site,
http://Muslim-academy.com/, especially in my blog area, which brings a good number of visitors daily. My organic traffic declined from 1k daily to 350.
I have removed the robots.txt file, resubmitted the existing sitemap, used all of the fetch-to-index options, and used the 50-URL submission option in Bing Webmaster Tools.
What can I do now to get these blocked URLs back into Google's index?
1. Create a new sitemap and submit it again in Google Webmaster Tools and Bing Webmaster Tools?
2. Bookmark, link-build, or share the URLs? I have already done a lot of bookmarking for the blocked URLs.
I fetched the list of blocked URLs using Bing Webmaster Tools.
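For context on how much damage one line can do: a single broad Disallow directive will block an entire site or blog section. A sketch of the kind of rule that causes this (illustrative only; the directives below are assumptions, not the actual file):

    # Blocks every crawler from the whole site -- a common leftover
    # from a dev/staging setup:
    User-agent: *
    Disallow: /

    # By contrast, an empty Disallow (or no robots.txt at all)
    # allows full crawling:
    User-agent: *
    Disallow: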
-
Robert, some good signs of life. The new sitemap shows 5,080 pages submitted and 4,817 indexed.
The remaining 263 pages are surely the blocked ones, right, Robert? There is also some improvement in impressions and clicks. Thanks a lot for staying with me this long to solve this issue.
-
Christopher,
Have you looked at the indexing reports in GWMT to see what has been indexed, how many pages, etc.?
-
Got your point, but I resubmitted it and its status is still pending.
I tested it and it was working, but since I submitted it two days ago its status has remained pending.
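(For anyone wanting to verify a live sitemap themselves while its status is pending, a minimal Python sketch like the following will fetch it and count the URL entries; the sitemap path is a placeholder, not necessarily the real one.)

    import urllib.request
    import xml.etree.ElementTree as ET

    # Hypothetical sitemap location -- substitute the real one.
    SITEMAP_URL = "http://muslim-academy.com/sitemap.xml"

    # Fetch the live sitemap and parse it as XML.
    with urllib.request.urlopen(SITEMAP_URL) as resp:
        tree = ET.parse(resp)

    # <loc> entries live in the standard sitemaps.org namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    locs = tree.findall(".//sm:loc", ns)
    print(f"Sitemap returned {len(locs)} URLs")

-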
No, when you resubmit or submit a "new" sitemap, it just tells Google this is the sitemap now. There is no content issue with a sitemap.
Best,
Robert
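P.S. To be clear about what gets submitted: a sitemap is just an XML list of URLs, so submitting it again duplicates nothing. A minimal sitemap looks roughly like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://muslim-academy.com/example-post/</loc>
        <lastmod>2013-05-01</lastmod>
      </url>
      <!-- one <url> entry per page -->
    </urlset>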
-
Just one last question, Robert. Doesn't a duplicate sitemap create duplicate pages in search results?
Sorry, my question may look crazy to you, but while applying every possible fix I want to be sure I do not mess up and make things even worse.
-
Given the only issue was the robots.txt error, I would resubmit. That said, I do think it would not hurt to generate a new sitemap and submit that, in case there is something you are missing.
Best
-
Robert, the question is: do I need to create a new sitemap, or resubmit the existing one?
-
Hello Christopher
It appears you have done a good deal to remediate the situation already. I would resubmit a sitemap to Google also. Have you looked in WMT to see what is now indexed? I would look at the graphs of indexed pages and robots.txt-blocked URLs to see if you are moving the needle upward again.
This begs a second question: "How did it happen?" You stated, "Robots.txt blocked 2k important URLs of my blogging site," and that sounds like it just occurred out of the ether. I would want to know that I had found the reason, and make sure I have a way to keep it from happening going forward (just a suggestion).
Lastly, the Index Status report in WMT should be a great way to learn how effective your fixes have been. I like knowing that type of data and storing it somewhere retrievable for the future.
Best to you,
Robert
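P.S. On keeping it from happening going forward: one lightweight option is a check, run after every site change, that tests your key URLs against the live robots.txt. A minimal sketch in Python, using only the standard library (the domain and URL list are assumptions; substitute your own):

    from urllib import robotparser

    SITE = "http://muslim-academy.com"
    KEY_URLS = [  # hypothetical examples of pages worth guarding
        SITE + "/",
        SITE + "/blog/",
    ]

    # Fetch and parse the live robots.txt.
    rp = robotparser.RobotFileParser()
    rp.set_url(SITE + "/robots.txt")
    rp.read()

    # Flag any key URL that Googlebot would be forbidden to crawl.
    blocked = [u for u in KEY_URLS if not rp.can_fetch("Googlebot", u)]
    if blocked:
        print("WARNING: robots.txt is blocking:", blocked)
    else:
        print("All key URLs are crawlable.")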
-
Related Questions
-
Can Intercom support pages and redirect issues affect the SEO performance of our website?
I noticed that most of the redirect issues I have are coming from our Intercom support links. Can Intercom support pages and these redirect issues affect the SEO performance of our website?
Reporting & Analytics | Envoke-Marketing
-
Blocking standard pages with robots.txt (T&Cs, shipping, pricing & privacy policies, etc.)
Hi, I've just had a best-practice site migration completed for my old e-commerce store into a Shopify environment, and I see in GSC that it's reporting my standard pages as blocked by robots.txt, such as the examples below. Surely I don't want these blocked? Is that likely due to my migrators or a default setting with Shopify, does anyone know? The pages: T&Cs, shipping policy, pricing policy, privacy policy, etc. So in summary: Shall I unblock these? What caused it, Shopify default settings or more likely my migration team? All best, Dan
Reporting & Analytics | Dan-Lawrence
-
Transitioning to HTTPS: do I have to submit another disavow file?
Hi Mozzers! So we're finally ready to move from HTTP to HTTPS; Penguin is finally showing us some love thanks to the recent algorithm updates. I just added the HTTPS property to Google Search Console, and before 301 redirecting the whole site to a secure environment: do I upload the same disavow file to the HTTPS version? Moreover, is it best to have both the HTTP and HTTPS versions in Google Search Console, or should I add the disavow file to the HTTPS version and delete the HTTP version in a month? And what about Bing? Help.
Reporting & Analytics | Shawn124
-
In Google Analytics, pages that were 301 redirected are still being reported. What's the issue here?
URLs that we redirected are still showing up in Google Analytics. Since they don't exist, they have high bounce rates. What could the issue be?
Reporting & Analytics | prestigeluxuryrentals.com
-
Longevity of robots.txt effects on Google rankings
This may be a difficult question to answer without a ton more information, but I'm curious whether there's any general thinking that could shed some light on the following scenario I've recently heard about, so I can offer some sound advice: an extremely reputable non-profit site with excellent rankings went through a redesign and changeover to WordPress. A robots.txt file was used during development on the dev site on the dev server. Two months later it was noticed through GA that traffic to the site was way down. It was then discovered that the robots.txt file hadn't been removed, and the new site (same content, same nav) had gone live with it in place. It has since been removed and a site index forced. How long might it take for the site to reappear and regain its past standing in the SERPs if rankings have been damaged? What would the expected recovery time be?
Reporting & Analytics | gfiedel
-
Having an issue with Site Search in Analytics
Hi Mozzers, we launched a website in October 2012 and have enabled "Do Track Site Search" in the Google Analytics settings for that profile, since we have a search box on the website. The Site Search report worked for ten days and then stopped (from the end of December until the beginning of January 2013). Since then I have been trying to understand this issue. I have added all the possible query search terms, but it is still not showing any signs of life. At this point I am not sure what to do; some help would be appreciated! Search URL = subdomain.example.com/search/node/...
Reporting & Analytics | Ideas-Money-Art
-
Two questions on avoiding issues with Google while being right in the middle of one
Hi SEOmoz community, I have two questions I would like to ask (with future SEO in mind).
1. Do you consider a WordPress multisite or various single installs 'safer' for SEO? Theoretically, packing various sites into one multisite network seems like an ideal solution. However, is there a chance that once one site in the network encounters a little 'negative turbulence', your other sites in the network might get impacted too, due to the cross-referencing and linked accounts (i.e. Webmaster Tools etc.)? It would seem outrageous, but then again I wouldn't rule it out. Do I even have to go as far as setting up new Gmail, Google Analytics, and Webmaster Tools accounts so the sites are technically not linked? You can see, I don't trust search engines one bit...
2. Is there still a point in posting articles while Google is having a hissy fit with your site? Basically, I am currently going through a 'rankings and traffic drops storm'. It's not as bad as being de-indexed, but it's still having enough of an impact. In addition, Google does not seem to treat my new articles (unique content) with the same attention anymore, i.e. it does not seem to index them fully, or at all (posting the headline in Google should return the article, but it doesn't). Is there even a point in spending time posting new material now, or may it pick up again once I am through this low phase? Does Google still index what it considers worthy, or is it a waste of time right now to keep posting, posting, and posting more?
Thanks for your help. I really appreciate it.
Reporting & Analytics | Hermski
-
Google Analytics goal funnel visualization issue
I've set up a goal funnel but am having an issue when I look at the funnel visualization: it doesn't appear to be recognizing the first step of the funnel that I've defined in the goal edit page. The "Property Listing page view" is located at /listings/xxx, where xxx is the number of the property. Within the funnel I've added /listings/*, but when I go to see the funnel visualization, I see zero counts for this step (even though it clearly shows "/listings/622", etc. on the entrance page to the left). I've attached a PDF (CRD-Funnel.pdf) with a few images to help make this clearer. Any thoughts?
Reporting & Analytics | chrisfree