Can someone explain this to me in simple language? Basically what do I have to do?
-
Accessible to Engines
Easy fix
<dl>
<dt>Crawl status</dt>
<dd>Status Code: 200
meta-robots: None
meta-refresh: 0;url=http://akaalpet.com/default.aspx
X-Robots: None</dd>
<dt>Explanation</dt>
<dd>Pages that can't be crawled or indexed have no opportunity to rank in the results. Before tweaking keyword targeting or leveraging other optimization techniques, it's essential to make sure this page is accessible.</dd>
<dt>Recommendation</dt>
<dd>Ensure the URL returns HTTP status code 200 and is not blocked by robots.txt, meta robots, or the X-Robots-Tag header (and does not meta refresh to another URL).</dd>
</dl>
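The crawl report above is checking four signals at once: the HTTP status code, the X-Robots-Tag header, the meta robots tag, and any meta refresh. A rough sketch of that check (hypothetical helper, simplified regex parsing that assumes typical attribute order; real pages may need a proper HTML parser):

```python
import re

def crawl_blockers(status_code, headers, html):
    """Return a list of reasons a page can't be cleanly crawled/indexed.

    Checks the same signals the crawl-status report lists: HTTP status,
    X-Robots-Tag header, meta robots, and meta refresh.
    """
    issues = []
    if status_code != 200:
        issues.append(f"status code is {status_code}, not 200")

    x_robots = headers.get("X-Robots-Tag", "")
    if re.search(r"noindex|nofollow", x_robots, re.I):
        issues.append(f"X-Robots-Tag blocks engines: {x_robots}")

    meta_robots = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)',
        html, re.I)
    if meta_robots and re.search(r"noindex|nofollow",
                                 meta_robots.group(1), re.I):
        issues.append(f"meta robots blocks engines: {meta_robots.group(1)}")

    refresh = re.search(
        r'<meta[^>]+http-equiv=["\']refresh["\'][^>]+content=["\']([^"\']+)',
        html, re.I)
    if refresh:
        issues.append(f"meta refresh redirects the page: {refresh.group(1)}")
    return issues

# The situation reported above: a 200 status, but a 0-second meta refresh.
page = '<meta http-equiv="refresh" content="0;url=http://akaalpet.com/default.aspx">'
print(crawl_blockers(200, {}, page))
```

Note that the page passes the status-code check yet still gets flagged, which is exactly why the report marks it "not accessible" despite the 200.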
-
I was certain that we had addressed our redirect issue, so I am very surprised to see it come up again. Can someone shed light on why we are getting the "Accessible to Engines" response?
This is the page I am crawling...
http://www.nutrivinevitamins.com/product/pure-encapsulations/
-
Thank you so much for explaining it in detail. It totally makes sense now.
-
Hi Devinder,
Definitely 100% get rid of that meta refresh. I am dealing with this exact same problem right now on a large site for which I do in-house SEO. Try this: put the URL http://www.akaalpet.com into Open Site Explorer and compare it to http://akaalpet.com/default.aspx.
Thanks to Jenn Lopes at SEOMoz, I was able to articulate to management why this was bad and worth spending (in our case) $1,000 to have fixed.
You are fragmenting the authority of what is probably your most authoritative page by having the meta refresh. It's definitely not OK in Google's eyes, because people used to use meta refreshes for nefarious purposes [spam]. Here's a screenshot. See how it's fragmenting your links? Oh, hey, one other important thing: your most authoritative page, http://www.akaalpet.com, isn't passing any of its link juice down to your other pages. The meta refresh is causing that too. Hope that helps!
-
Thank you, Phil. Yeah, it does make sense. So is it better to remove it, or is it OK in Google's eyes?
-
The problem is with this line:
meta-refresh: 0;url=http://akaalpet.com/default.aspx
You're basically telling the search engines, "I know you came here to check out www.akaalpet.com, but I'd rather you head over to http://akaalpet.com/default.aspx."
Because of that, when someone visits www.akaalpet.com you're redirecting them to that other URL right away.
Make sense?
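The fix suggested in this thread is to delete that meta refresh tag from the page and, if a redirect is genuinely needed, issue a server-side 301 instead so engines consolidate authority on one URL. A minimal sketch of stripping the tag from stored HTML (hypothetical helper, in Python for illustration; the regex assumes a well-formed tag):

```python
import re

# Matches a <meta http-equiv="refresh" ...> tag, plus trailing whitespace.
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv=["\']refresh["\'][^>]*>\s*', re.I)

def strip_meta_refresh(html):
    """Remove any meta refresh tag from a page's HTML.

    If the redirect is actually needed, replace it with a server-side
    301 (permanent) redirect rather than restoring the meta refresh.
    """
    return META_REFRESH.sub("", html)

page = ('<head><meta http-equiv="refresh" '
        'content="0;url=http://akaalpet.com/default.aspx"></head>')
print(strip_meta_refresh(page))  # <head></head>
```

On an ASP.NET site like this one (the `.aspx` URL suggests IIS), the server-side 301 would typically be configured in the application or web server config rather than in the page itself.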
-
You need to create a robots.txt file and upload it to your server.
www.akaalpet.com/robots.txt does not exist...
This should help you set it up: http://www.seomoz.org/learn-seo/robotstxt
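For reference, a minimal robots.txt that allows all crawling looks like the fragment below (the Sitemap line is optional and assumes a sitemap actually exists at that path):

```
User-agent: *
Disallow:

Sitemap: http://www.akaalpet.com/sitemap.xml
```

Note the empty `Disallow:` means nothing is blocked; listing a path there (e.g. `Disallow: /admin/`) would exclude it from crawling.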
-
The URL www.akaalpet.com is the one giving this error, but if I try http://akaalpet.com/default.aspx it doesn't show the error. What do you think I should do?
-
Without a URL it's difficult to tell... I would check your robots.txt file for a Disallow rule, or check in your Google Webmaster Tools to see whether this page is being blocked from indexing.