Have a Robots.txt Issue
-
I have a robots.txt error that is causing me loads of headaches and is making my website fall off the search engine grid. Moz and other tools are reporting that I've blocked all search engines from finding the site. Could it be as simple as this: I created a new website and forgot to re-create a robots.txt file for the new site, so crawlers were still finding the old one? I have just created a new one.
Google Search Console still shows severe health issues for the property and says that robots.txt is blocking important pages. Does this take time to refresh? Is there something I'm missing that someone here in the Moz community could help me with?
-
Hi primemediaconsultants!
Did this get cleared up?
-
You don't always have to do that. If you go to domain.com/robots.txt, the block may already have been removed. If that's the case, you should start to see an increase in the number of pages crawled in Google Search Console.
-
This seems very helpful, as I did remove it and ran Fetch as Google, but I'm a complete novice. How do you clear the server cache?
-
What does your robots.txt file contain? (Or share the link.)
Try removing it, clearing the server cache, and fetching as Google again.
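As a general illustration only (the domain and sitemap URL are placeholders, not the poster's actual file): a robots.txt that blocks every crawler from the whole site, which is what triggers warnings like the one described above, typically looks like the first sketch below, while one that allows full crawling looks like the second.

    # Example 1: blocks all crawlers from the entire site
    User-agent: *
    Disallow: /

    # Example 2: allows all crawlers to access the entire site
    User-agent: *
    Disallow:
    Sitemap: http://www.domain.com/sitemap.xml

Checking which of the two the live domain.com/robots.txt resembles is usually the quickest way to confirm the problem.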
Related Questions
-
Search Console Change of Address Tool Issue
Intermediate & Advanced SEO | RichardUK
We're currently migrating a few "event" mini sites to a main site that will have a subfolder for each event. Example: newsite.com/event1. The issue is that Search Console is not able to verify this kind of redirect: example.com --> 301 --> newsite.com/event. Do you know of any workaround for this? I was thinking of using a subdomain instead, which would in turn redirect to the /event subfolder, but with each hop it will diminish the link's strength. I would prefer not to leave it as a subdomain, as data gets mashed up with subdomains in Google Analytics and we have seen worse ranking results with subdomains. Any help is greatly appreciated.
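As a hedged sketch only (assuming an Apache server; the domain and folder names follow the example above and are otherwise placeholders), the host-wide 301 being described could be configured in the old domain's .htaccess roughly like this:

    # .htaccess on example.com: 301 every request to the matching
    # path under the event subfolder on the new site
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
    RewriteRule ^(.*)$ http://newsite.com/event1/$1 [R=301,L]

Note that this covers only the redirect itself, not the Search Console verification issue raised in the question.
-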
Shall I hide short product review texts from customers (to avoid google panda/quality issues)?
Intermediate & Advanced SEO | lcourse
About 30% of the product reviews that clients of our ecommerce store have submitted over the last 10 years are three words or less (we did not require any minimum length). Would you recommend hiding those very short review texts? Where should we draw the limit? The numeric star rating would still go into our accumulated product rating. My only concern is what impact this may have on Google rankings. To give some context, the site has had Panda/Phantom-related issues for a long time, with no obvious causes that we could point to.
-
International Href Lang Tag Parameter Issue
Intermediate & Advanced SEO | ggpaul562
Hey, let's say I'm on the following page: site.com/product-name/product-code/?d=womens. I view the page source and it looks like this... My question is, should I remove the parameter from the hreflang tag? I just need some clarification that no parameter page should have a canonical tag and/or hreflang tag containing parameters.
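As a hedged sketch of the setup being asked about (the URLs are placeholders and the /uk/ alternate is purely hypothetical): the usual pattern is to point both the canonical tag and the hreflang annotations at the parameter-free URL, even on the ?d=womens version of the page.

    <!-- In the <head> of site.com/product-name/product-code/?d=womens -->
    <!-- Canonical and hreflang both reference the clean, parameter-free URLs -->
    <link rel="canonical" href="http://site.com/product-name/product-code/" />
    <link rel="alternate" hreflang="en-us" href="http://site.com/product-name/product-code/" />
    <link rel="alternate" hreflang="en-gb" href="http://site.com/uk/product-name/product-code/" />
-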
Rankings disappeared on main 2 keywords - are links the issue?
Intermediate & Advanced SEO | stephenshone
Hi, I asked a question around 6 months ago about our rankings steadily declining since April of 2013. I did originally reply to that topic a few days ago, but as it's so old I don't think it's been noticed, so I'm posting again here; if that's an issue I'm happy to delete it. Here it is for reference: http://moz.com/community/q/site-rankings-steadily-decreasing-do-i-need-to-remove-links
Since the original post, I have done nothing linkbuilding-wise except publishing blog posts and sharing them on Facebook, G+ and Twitter. There are some links in there which don't look great (i.e. spammy SEO directories, which I'm sending removal requests to), although quite a lot of others are relevant. Here's my link profile: http://www.opensiteexplorer.org/links?site=www.thomassmithfasteners.com
I've tried to make the site more accessible: we now have a simple, responsive design and I've tried to make the content clear and concise; in short, written for humans rather than search engines. As of the end of November, 'nuts and bolts' has now disappeared completely, and 'bolts and nuts' is page 8. There are many pages ranking much higher which are not as relevant and have no links. We still rank highly for more specialised terms (i.e. 'bsw bolts' and 'imperial bolts' are still page 1), but not as high as before. We get an 'A' grade on the on-page grader for 'nuts and bolts', and most sites above us get an F.
I was cautious about removing links, as our profile doesn't seem too bad, but it does seem as if the links are the issue. There are a fair few questionable directories in there, no doubt about that, but our overall practice in recent years has been natural building and link earning. So I've created a spreadsheet and identified the bad links, i.e. directories with any SEO connotations. I am about to submit removal requests; I thought two polite requests a couple of weeks apart prior to disavowing with Google. But am I safe to disavow straight away? I ask because I don't think I'll get many responses from those directories.
I am also gradually beefing up the content on the shop pages in case of any 'thin content' issues, after advice on the previous post. I noticed hundreds of broken links in Webmaster Tools last week, due to 2 broken links on our blog that repeated on every page, and have fixed those. I have also been fixing errors W3C compliance-wise. Am I right to do all this? Can anyone offer any suggestions? I'm still not 100% sure if this is Panda, Penguin or something else. My guess is Penguin, but the decline started in March 2013, which correlates with Panda.
Best Regards and thanks for any help, Stephen
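Since disavowing comes up above, a hedged sketch of the file format (the domains and URL are made-up examples): a Google disavow file is a plain text file uploaded through the Disavow Links tool, with one URL or domain: entry per line and # for comments.

    # Requested removal from these directories twice, no response received
    domain:spammy-seo-directory.example
    domain:cheap-links-directory.example
    # A single URL can also be disavowed instead of a whole domain
    http://another-directory.example/listing/12345
-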
Robots.txt: how to exclude sub-directories correctly?
Intermediate & Advanced SEO | fablau
Hello here, I am trying to figure out the correct way to tell SEs to crawl this:
http://www.mysite.com/directory/
But not this:
http://www.mysite.com/directory/sub-directory/
or this:
http://www.mysite.com/directory/sub-directory2/sub-directory/...
Given that I have thousands of sub-directories with almost infinite combinations, I can't put the following definitions together in a manageable way:
disallow: /directory/sub-directory/
disallow: /directory/sub-directory2/
disallow: /directory/sub-directory/sub-directory/
disallow: /directory/sub-directory2/subdirectory/
etc...
I would end up having thousands of definitions to disallow all the possible sub-directory combinations. So, is the following a correct, better and shorter way to define what I want above:
allow: /directory/$
disallow: /directory/*
Would the above work? Any thoughts are very welcome! Thank you in advance. Best, Fab.
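Not a confirmed answer, just a sketch of the pattern being asked about (mysite.com and /directory/ are the poster's placeholders): Google and Bing support the * and $ wildcards, and Google resolves conflicting rules in favour of the more specific one, with Allow winning ties, so the combination would typically be written as follows.

    User-agent: *
    # Block everything underneath /directory/...
    Disallow: /directory/*
    # ...but explicitly allow the /directory/ landing page itself
    Allow: /directory/$

Crawlers that only implement the original robots.txt specification may ignore the wildcards, so it is worth checking the rules in each search engine's own robots.txt tester.
-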
Site duplication issue....
Intermediate & Advanced SEO | MiroAsh
Hi All, I have a client who duplicated an entire section of their site onto another domain about a year ago. The new domain was ranking well but was hit heavily back in March by Panda. I have to say the setup isn't great and the solution I'm proposing isn't ideal; however, as an agency we have only been tasked with "performing SEO" on the new domain. Here is an illustration of the problem: http://i.imgur.com/Mfh8SLN.jpg
My solution is to 301 redirect the duplicated area of the original site (around 150 pages) out to the new domain, but I'm worried this could cause a problem, as I know you have to be careful with redirecting internal pages to external ones when it comes to SEO. The other issue is that the client would like to retain the menu structure on the main site, but I do not want to put an external link in the main navigation, so my proposed solution is as follows: implement 301 redirects for the URLs from the original domain to the new domain, remove the link to this section from the main navigation of the original site, and add a boilerplate "Visit xxx for our xxx products" kind of link to the other site in another area of the template. An illustration of this can be found here: http://i.imgur.com/CY0ZfHS.jpg
I'm sure the best solution would be to redirect URLs from the new domain into the original site and keep all sections within the one domain and optimise that one site, but my hands are somewhat tied on this one. I just wanted clarification or advice on the solution I've proposed, and reassurance that it won't dramatically affect the standing of the current sites.
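As a hedged sketch only (assuming an Apache server; the folder and domain names are made-up placeholders, since the real ones aren't given above), the section-level 301 described in the proposed solution could look roughly like this in the original domain's .htaccess:

    # 301 the duplicated section of the original site to the
    # matching paths on the new domain
    RewriteEngine On
    RewriteRule ^duplicated-section/(.*)$ http://www.new-domain.example/$1 [R=301,L]
-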
Help diagnosing a complex SEO issue
Intermediate & Advanced SEO | seo-wanna-bs
Good evening SEOmoz. A series of events in close succession is making it somewhat difficult for me to diagnose the cause of fluctuations in traffic. Please excuse some of the stupid moves I made, but desperation got the better of me.
One of my most beloved websites was hit by Panda on January 18th, pretty sure due to a CMS bug that is now fixed. The site started to show great signs of recovery from April 19th (Panda 3.5). I'm going to be as explicit as possible about the traffic for the days that follow; traffic was stable previously. April 20th: +10%. April 21st: +5%. April 22nd: +5% (halfway recovered, and also the first real fluctuation since the site was hit in January).
Due to the looming over-optimisation penalty, on the 22nd I changed the titles to de-optimise them a little (fear is a dangerous thing at times). April 23rd: -10%. April 24th: -10%. April 25th onwards: pretty much levelled out.
The websites I've seen hit by Penguin lost around 40% of their traffic, very steeply, on 24th and 25th April, so the drops aren't in keeping with my experience of Penguin. But they do coincide perfectly with the massive site-wide title change. I haven't read anything definitive about a penalty for changing titles too often, but for obvious reasons it makes sense. The drop seems terribly soon after changing titles, but the site is very heavily indexed.
It's also worth mentioning that I changed the titles back, in case it was purely the fact that the titles had been slightly de-optimised that caused the drop. I waited until May 5th. This had no positive nor negative effect.
It's a lot to take in, but I'd love to hear your thoughts. I'm feeling a little bamboozled looking at all the figures. There was of course the above-the-fold update on January 19th, but let's ignore that, as we've only ever had a maximum of 1 ad per page and most pages have none.
-
Why should I add URL parameters when Meta Robots NOINDEX is available?
Intermediate & Advanced SEO | CommercePundit
Today, I checked Bing Webmaster Tools and came across the "Ignore URL parameters" feature. Bing Webmaster Tools suggests certain parameters for URLs where I have already added a META robots tag with NOINDEX, FOLLOW. I can see the canopy_fabric_search parameter in the suggested section. It's due to the following kind of URLs:
http://www.vistastores.com/patio-umbrellas?canopy_fabric_search=1728
http://www.vistastores.com/patio-umbrellas?canopy_fabric_search=1729
http://www.vistastores.com/patio-umbrellas?canopy_fabric_search=1730
http://www.vistastores.com/patio-umbrellas?canopy_fabric_search=2239
But I have added META robots NOINDEX, FOLLOW to keep these pages out of the index. So why does this happen?
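For reference, a sketch of the meta robots tag being described (placed in the <head> of each parameterised URL; the URL in the comment is one of the examples above):

    <!-- e.g. on http://www.vistastores.com/patio-umbrellas?canopy_fabric_search=1728 -->
    <!-- Tells crawlers not to index this page but still to follow its links -->
    <meta name="robots" content="noindex, follow" />

Since noindex does not stop a URL from being crawled, tools that build parameter suggestions from crawl data can still surface these URLs.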