Robots.txt blocking meta description from being read
-
I recently updated my robots.txt file to fix the description snippet on Google, which reads "A description for this result is not available because of this site's robots.txt – learn more". How can I get Google to show my META description instead?
-
Hi IMM,
What page are you trying to fix? And what did you change in the robots.txt file to fix it?
-
It sounds like you have something going on like what Matt Cutts talks about here:
http://www.youtube.com/watch?v=KBdEwpRQRD0
You have a result showing up in the SERPs even though the page is blocked in robots.txt. Basically, the reason it is still in the SERPs is that other pages are linking to that URL on your site.
I am going to assume that you want to keep these pages out of the index.
As you already have pages in the index, you need to get them removed, not just block them. I would suggest adding a noindex meta tag and then letting the crawler recrawl the page. A robots.txt block stops the bot cold: it never fetches the page, so it never reads anything on it, including the meta tags. If you instead let the bot read the noindex meta tag, that tag directs Google to take the page out of the search results.
https://support.google.com/webmasters/answer/93708?hl=en
"When we see a noindex meta tag on a page, Google will completely drop the page from our search results, even if other pages link to it."
That said, if you made a mistake and have been blocking Google when you did not mean to, make sure you do not use the noindex meta tag on those pages, and make sure you are not blocking Google in your robots.txt. If that is the case and you are still seeing the wrong info in the SERPs, you will need to wait a little while: the updates are not instantaneous and may take a few weeks to show up in the SERPs. In the meantime, just double-check that everything in your robots.txt is correct.
Be patient and good luck!
-
I only have pages in the checkout flow blocked. My homepage and other pages should be accessible.
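For context, a robots.txt that blocks only the checkout flow while leaving the homepage and everything else crawlable would look something like this (the `/checkout/` path is an assumption; substitute your actual checkout URLs):

```
User-agent: *
Disallow: /checkout/
```

If the homepage snippet still shows the robots.txt message, it is worth double-checking that no broader rule (such as `Disallow: /`) appears in the file.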
-
If you deny the robot access to a URL, it will not be able to read the meta tags there. There isn't a way around this; that is the purpose of denying robots. If the robot is allowed, the snippet will update when Googlebot recrawls your page and updates its index. How long that takes varies from site to site and page to page.
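To make the mechanics concrete, here is a small sketch using Python's standard `urllib.robotparser`, which applies the same path-matching rules a crawler does (the example.com URLs and the `/checkout/` path are hypothetical):

```python
from urllib import robotparser

# Hypothetical rules mirroring the poster's setup: only the
# checkout flow is blocked; everything else is crawlable.
rules = """\
User-agent: *
Disallow: /checkout/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# The homepage is fetchable, so the bot can read its meta tags.
print(parser.can_fetch("Googlebot", "https://example.com/"))               # True

# The checkout pages are blocked, so their meta tags are never read.
print(parser.can_fetch("Googlebot", "https://example.com/checkout/cart"))  # False
```

Because the blocked URL can never be fetched, any snippet Google shows for it has to be assembled from off-page signals like anchor text, which is exactly why the "description not available" message appears.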