Severe health issues are found on your site. - Check site health (GWT)
-
Hi,
We run a Magento website. When I log in to Google Webmaster Tools, I am getting this message:
Severe health issues are found on your site. - Check site health
Is robots.txt blocking important pages? Some important page is blocked by robots.txt.
Now, this is the weird part - the page being blocked is the admin page of Magento, under www.domain.com/index.php/admin/etc.....
Now, this message just won't go away - it's been there for days now - so why does Google think this is an "important page"? It doesn't normally complain if you block other parts of the site?
Any ideas?
Thanks
-
This is the robots file line:
Disallow: /index.php/
Now, Magento has custom URL rewriting - so no URL uses index.php, although this is the file that is used to run the system itself.
Should I unblock it?
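For anyone who wants to sanity-check what that rule actually matches, here is a minimal Python sketch using the standard-library robots.txt parser. It feeds in the exact directive quoted above and tests two hypothetical URLs - the admin path from the warning and a made-up rewritten front-end URL (both are placeholders, not real pages):

from urllib.robotparser import RobotFileParser

# The rule quoted above, applied to all crawlers.
rules = [
    "User-agent: *",
    "Disallow: /index.php/",
]

rp = RobotFileParser()
rp.parse(rules)

# The Magento admin lives under /index.php/, so the rule blocks it -
# which is exactly what the Webmaster Tools warning is reporting.
print(rp.can_fetch("Googlebot", "http://www.domain.com/index.php/admin/"))            # False

# A rewritten front-end URL never contains /index.php/, so it is unaffected.
# (Hypothetical URL - substitute a real product or category page.)
print(rp.can_fetch("Googlebot", "http://www.domain.com/some-category/some-product"))  # True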
-
Thanks - I thought so, just wanted to double check. Cheers
Related Questions
-
Old sitemaps after site migration.
Hi, I was wondering if it's safe to remove all the sitemaps for the old site in Search Console? It's been 3 months since the site migration from http://sitea.com (301 redirected) to http://siteb.com. Therefore, can I delete the old sitemap for http://sitea.com from Search Console? Thanks.
Intermediate & Advanced SEO | ggpaul5620
-
Old site penalised, we moved: shall we cut loose from the old site? It's currently 301'd to the new site.
Hi, We had a site with many bad links pointing to it (.co.uk). It was knocked from the SERPs. We tried to manually ask webmasters to remove links, then submitted a disavow and a reconsideration request. We have since moved the site to a new URL (.com), about a year ago, as the company needed its customers to still find them. We 301 redirected the .co.uk to the .com. There are still lots of bad links pointing to the .co.uk. The questions are:
#1 Do we stop the 301 redirect from .co.uk to .com now? The .co.uk is not showing in the rankings. We could have a basic holding page on the .co.uk with 'we have moved' (no link), or just switch it off.
#2 If we keep the .co.uk 301 to the .com, shall we upload the disavow to the .com Webmaster Tools or the .co.uk Webmaster Tools? I ask this because someone else had uploaded the .co.uk's disavow list of spam links to the .com Webmaster Tools. Is this bad?
Thanks in advance for any advice or insight!
Intermediate & Advanced SEO | SolveWebMedia0
-
Issue with Site Map - how critical would you rank this in terms of needing a fix?
A problem has been introduced on our sitemap whereby previously excluded URLs are no longer being correctly excluded. These are returning an HTTP 400 Bad Request server response, although they do correctly redirect users. We have around 2,300 pages of content, and around 600-800 of these previously excluded URLs. An example would be http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/botswana/suggested-holidays/botswana-classic-camping-safari/Dates and prices.aspx (the page does correctly redirect users). The site is currently being rebuilt and only has a life span of a few months. The cost our current developers have given us for resolving this is quite high with this in mind. I was just wondering: how much of a critical issue would you view this? Would it be sufficient (bearing in mind this is an interim measure) to change these pages so that they had a canonical or a redirect - they would however remain on the sitemap. Thanks, Kate
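If it helps to size the problem before deciding, here is a rough standard-library Python sketch that pulls the URLs out of a sitemap and prints any that return an error status. The sitemap location is an assumption - point it at whichever sitemap file the site actually serves:

import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.naturalworldsafaris.com/sitemap.xml"  # assumed path - adjust
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Assumes a plain urlset sitemap (not a sitemap index) with properly encoded URLs.
with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        with urllib.request.urlopen(url) as page:
            status = page.status
    except urllib.error.HTTPError as err:
        status = err.code  # 4xx/5xx responses land here
    if status >= 400:
        print(status, url)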
Intermediate & Advanced SEO | KateWaite
-
Several 301 Redirects to Same Page
Hi, I have 3 pages we won't use anymore on our website. Let's call them url A, url B and url C. To keep their SEO strength on our domain, I've thought about redirecting all of them to url D. From what I understand, when 301 redirecting, about 85-90% of the link SEO juice is passed. Then, if I redirect 3 URLs to the same page, does url D receive all the link SEO juice from those URLs added up (approximately)?
e.g. future url D juice = 100% current url D juice + 85% url A juice + 85% url B juice + 85% url C juice
Is this the best practice, or is there a better way? Cheers
Intermediate & Advanced SEO | viatrading1
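Purely to illustrate the arithmetic of the "url D" formula in the question above - the 85% pass-through figure is the questioner's assumption and the starting values below are invented:

# Toy numbers only; "link juice" isn't something you can actually measure like this.
PASS_THROUGH = 0.85                 # assumed share of value a 301 passes on
url_a, url_b, url_c = 40, 25, 10    # hypothetical value of the three retired URLs
url_d = 50                          # hypothetical current value of the target URL

future_d = url_d + PASS_THROUGH * (url_a + url_b + url_c)
print(future_d)  # 50 + 0.85 * 75 = 113.75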
Why does a site have no domain authority?
A website was built and launched eight months ago, and its domain authority is 1. When a site has been live for a while and has such a low DA, what's causing it?
Intermediate & Advanced SEO | optimalwebinc0
-
This site got hit, but why?
I am currently looking at taking on a small project website which was recently hit, but we are really at a loss as to why, so I wanted to open this up to the floor and see if anyone else had some thoughts or theories to add. The site is Howtotradecommodities.co.uk, and the site appeared to be hit by Penguin, because sure enough it dropped from several hundred visitors a day to less than 50. Nothing was changed about the website, and looking at the Analytics it bumbled along at less than 50 visitors a day. On June 25th, when Panda 3.8 hit, the site saw traffic increase to between 80-100 visitors a day and steadily increased almost to pre-Penguin levels. On August 9th/10th, traffic dropped off the face of the planet once again. This site has some amazing links:
http://techcrunch.com/2012/02/04/algorithmsdata-vs-analystsreports-fight/
http://as.exeter.ac.uk/library/using/help/business/researchingfinance/stockmarket/
These were earned entirely naturally/editorially. I know these aren't "get out of jail free cards" but the rest of the profile isn't that bad either. Normally you can look at a link profile and say "Yep, this link and that link are a bit questionable", but beyond some slightly off-topic guest blogging done a while back, before I was looking to get involved in the project, there really isn't anything all that fruity about the links in my opinion. I know that the site design needs some work, but the content is of a high standard and it covers its topic (commodities) in a very comprehensive and authoritative way. In my opinion (I'm not biased yet because it isn't my site) this site genuinely deserves to rank. As far as I know, this site has received no unnatural link warnings. I am hoping this is just a case of us having looked at this for too long and it will be a couple of obvious/glaring fixes to someone with a fresh pair of eyes. Does anyone have any insights into what the solution might be?
[UPDATE] After responses from a few folks I decided to update the thread with progress I made on investigating the situation. After plugging the domain into Open Site Explorer I can see quite a few links that didn't show up in Link Research Tools (which is odd, as I thought LRT was powered by Mozscape, but anyway... shows the need for multiple tools). It does seem like someone in the past has been a little trigger-happy with building links to some of the inner pages.
Intermediate & Advanced SEO | JamesAgate
-
Googlebot found an extremely high number of URLs on your site
I keep getting the "Googlebot found an extremely high number of URLs on your site" message in GWT for one of the sites that I manage. The error is as below:
Googlebot encountered problems while crawling your site. Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site.
I understand the nature of the message - the site uses faceted navigation and is genuinely generating a lot of duplicate pages. However, in order to stop this from becoming an issue we do the following: no-index a large number of pages using the on-page meta tag, and use a canonical tag where it is appropriate. But we still get the error, and a lot of the example pages that Google suggests are affected by the issue are actually pages with the no-index tag. So my question is: how do I address this problem? I'm thinking that as it's a crawling issue the solution might involve the no-follow meta tag. Any suggestions appreciated.
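As a small illustration of the canonical-tag approach mentioned in the question, here is a hedged Python sketch that maps a faceted URL back to its canonical version by dropping the filter parameters. The parameter names are assumptions - substitute whatever facets the navigation actually generates:

from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Query parameters produced by the faceted navigation (assumed names).
FACET_PARAMS = {"colour", "size", "sort", "page"}

def canonical_url(url):
    """Return the URL with all facet parameters stripped out."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

# Every filtered variation collapses onto the same canonical page:
print(canonical_url("http://www.example.com/shoes?colour=red&size=9&sort=price"))
# -> http://www.example.com/shoes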
Intermediate & Advanced SEO | BenFox0
-
How important are social bookmarking sites?
How important are links from social bookmarking sites from an SEO perspective?
Intermediate & Advanced SEO | casper4340