Have a Robots.txt Issue
-
I have a robots.txt error that is causing me loads of headaches and is making my website fall off the search engine grid. On Moz and other sites it's saying that I blocked all crawlers from finding it. Could it be as simple as this: I created a new website and forgot to re-create a robots.txt file for the new site, so it was trying to find the old one? I just created a new one.
Google Search Console still shows that there are severe health issues found in the property and that robots.txt is blocking important pages. Does this take time to refresh? Is there something I'm missing that someone here in the Moz community could help me with?
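For reference, the usual culprit in cases like this is a robots.txt that shuts out every crawler. If the file contains these two lines, the entire site is blocked from crawling:

    User-agent: *
    Disallow: /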
-
Hi primemediaconsultants!
Did this get cleared up?
-
You don't always have to do this. If you go to domain.com/robots.txt, the blocking rule may already be gone. If that's the case, you should start to see an increase in the number of pages crawled in Google Search Console.
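A quick way to see exactly what is being served to crawlers is to fetch the file directly from the command line (substituting your own domain for the placeholder):

    curl -s https://www.example.com/robots.txt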
-
This seems very helpful, as I did remove it and ran Fetch as Google, but I'm a complete novice. How do you clear server cache?
-
What does your robots.txt file contain? (or share the link)
Try removing it, clearing the server cache, and fetching as Google again.
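For comparison, a minimal robots.txt that allows everything to be crawled looks like this; the empty Disallow line permits all URLs, and the Sitemap line (with your own URL substituted for the placeholder) is optional:

    User-agent: *
    Disallow:

    Sitemap: https://www.example.com/sitemap.xml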
Related Questions
-
Ranking Issue for New Site
Hi all, I have a specific SEO challenge. Six months ago we started to build an eCommerce site (located in the UK). In order to speed up the site launch, we copied the entire site over from an existing site based in Ireland. The new UK site has now been running for 5 months. Google has indexed many pages, which is good, but we can't rank high (position: between 20 and 30 for most pages). We thought it was because of content duplication in spite of the different regions, so we tried to optimize the pages on the UK site to make them more UK-related and avoid content duplication. I've also used schema to tell Google it's a UK-based site, set up Google My Business, and got more local citations. If you could give me any suggestions, it'd be perfect. Thank you so much for your time and advice.
Intermediate & Advanced SEO | Insightful_Media
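For illustration, "using schema to tell Google it's a UK-based site" might look something like the JSON-LD below; the business name, locality, and URL are hypothetical placeholders:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Store UK",
      "url": "https://www.example.co.uk/",
      "address": {
        "@type": "PostalAddress",
        "addressLocality": "London",
        "addressCountry": "GB"
      }
    }
    </script>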
-
Need help with Robots.txt
An eCommerce site built with MODX CMS. I found lots of auto-generated duplicate page issues on that site. Now I need to disallow some pages from that category. Here is what the actual product page URL looks like:
product_listing.php?cat=6857
And here is the auto-generated URL structure:
product_listing.php?cat=6857&cPath=dropship&size=19
Can anyone suggest how to disallow this specific category through robots.txt? I am not so familiar with MODX and this kind of link structure. Your help will be appreciated. Thanks
Intermediate & Advanced SEO | Nahid
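A sketch of one possible rule, assuming the goal is to block any URL containing the auto-generated cPath parameter while leaving the base category URLs crawlable; the * wildcard is supported by Google and Bing:

    User-agent: *
    Disallow: /*cPath=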
-
Solving pagination issues for e-commerce
I would like to ask about a technical SEO issue that may cause duplicate content and crawling problems. For pagination, how should rel="canonical", rel="prev"/rel="next", and the noindex tag be implemented? Should all three be within the same page source? Say, for example, one particular category has 10 pages of products (product catalogues). We would noindex page 2 onwards, rel-canonical them back to the first page, and also add rel="prev" and rel="next" to each page so Google can understand they are part of a multi-page series. If we index these multiple pages it will cause duplicate content issues, but I'm not sure whether all three tags need adding. It's also my understanding that search results pages should be noindexed, as they do not provide much value as an entry point from search engines.
Intermediate & Advanced SEO | Jseddon92
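As a rough sketch, the head of page 2 in the series described above would combine the tags like this (URLs are placeholders). Note that canonicalizing every page back to page 1 while also using rel="prev"/"next" sends conflicting signals, which is exactly the tension the question raises; the more common pattern is a self-referencing canonical on each paginated page:

    <link rel="prev" href="https://www.example.com/category?page=1">
    <link rel="next" href="https://www.example.com/category?page=3">
    <link rel="canonical" href="https://www.example.com/category?page=2">
    <meta name="robots" content="noindex, follow">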
-
Wordpress uploads folder issues
Hi, I have recently moved my WordPress blog to a new server. Previously it had the URL website.com/blog; the blog is now running on the domain website.com. Most of my images are in the correct folder path, wp-content/uploads. However, some of my images are pointing to the folder /blog/wp-content/uploads, so I am getting many missing images on the front end. How do I get /blog/wp-content/uploads to point to the new path, wp-content/uploads? Thanks guys. Taiger
Intermediate & Advanced SEO | Taiger
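One common way to handle this, assuming an Apache server with mod_rewrite enabled, is a 301 rule in the site root's .htaccess that maps the old upload paths onto the new ones:

    RewriteEngine On
    # Permanently redirect /blog/wp-content/uploads/... to /wp-content/uploads/...
    RewriteRule ^blog/wp-content/uploads/(.*)$ /wp-content/uploads/$1 [R=301,L]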
-
Blocking poor quality content areas with robots.txt
I found an interesting discussion on Search Engine Roundtable where Barry Schwartz and others were discussing using robots.txt to block low-quality content areas affected by Panda: http://www.seroundtable.com/google-farmer-advice-13090.html The article is a bit dated, and I was wondering what current opinions are on this. We have some dynamically generated content pages which we tried to improve after Panda. Resources have been limited and, alas, the pages are still there. Until we can officially remove them, I thought it may be a good idea to just block the entire directory. I would also remove them from my sitemaps and resubmit. There are links coming in, but I could redirect the important ones (I was going to do that anyway). Thoughts?
Intermediate & Advanced SEO | Eric_edvisors
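For what it's worth, blocking an entire directory is a one-line rule, assuming the dynamically generated pages all live under a single path (the directory name here is hypothetical):

    User-agent: *
    Disallow: /dynamic-content/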
-
Can URLs blocked with robots.txt hurt your site?
We have about 20 testing environments blocked by robots.txt, and these environments contain duplicates of our indexed content. They are appearing in Google's index as "blocked by robots.txt". Can they still count against us or hurt us? I know the best practice for permanently removing them would be to use the noindex tag, but I'm wondering whether they can still hurt us if we leave them the way they are.
Intermediate & Advanced SEO | nicole.healthline
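For reference, the noindex route mentioned above would require unblocking the environments in robots.txt first, since a crawler cannot see a noindex directive on a page it is forbidden to fetch. The directive can then be served either as a meta tag in each page:

    <meta name="robots" content="noindex">

or as an HTTP response header at the server level:

    X-Robots-Tag: noindex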
-
Duplication Issue?
One of our copywriters has just written a post for our company blog, to be reviewed by myself. However, I noticed that the post duplicates about 60% of one of our own product pages. Is it still worth posting? Will search engines still index the blog post? Kind regards,
Intermediate & Advanced SEO | Paul78
-
In-House SEO - Doubt about one SEO issue - Plz guys help over here =)
Hello, we want to promote some of our software products. I will give you one example below: http://www.mediavideoconverter.de/pdf-to-epub-converter.html We also have this domain: http://pdftoepub.de/ How can we deal with the duplicate content, and how can we improve the first domain's product page? If I use the canonical tag, don't index the second domain, and link to the first domain, will that help, or will it not make any difference? Keywords: pdf to epub, pdf to epub converter. What do you think about this technique? Good or bad? Is the second domain giving any value to the first domain's page? Thanks in advance.
Intermediate & Advanced SEO | augustos
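A minimal sketch of the cross-domain canonical idea described above, using the two URLs from the question: the page on the second domain points search engines at the first domain's product page, consolidating the duplicate-content signals there:

    <!-- In the <head> of the matching page on pdftoepub.de -->
    <link rel="canonical" href="http://www.mediavideoconverter.de/pdf-to-epub-converter.html">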