Severe health issue on my site in Google Webmaster Tools
-
I use Go Daddy Website Tonight. I keep getting a severe health message in Google Webmaster Tools stating that my robots.txt file is blocking an important page. When I try to get more details, the blocked file will not open. When I asked the Go Daddy peeps, they told me it was just image and backup files that do not need to be crawled. But if Google's spiders keep thinking an important page is blocked, will this hurt my SERPs?
-
I would just like to add: If you're considering signing up for something like SEV anyway, you may as well get a real hosting package.
-
Thanks for letting us know, and glad you found a workaround. A 0-second META REFRESH sometimes acts like a 301 - it's not ideal, as you said, but it's something.
-
For anyone else with Website Tonight, I have finally found a workaround, if not a fix. Since Website Tonight will not allow you to do a 301 redirect of an old page, I have figured out that if you re-create the deleted page (just the URL, not the content) and use a meta tag to do a REFRESH to the new page, any time the old page is clicked on it will bring visitors to the new page. Not ideal, of course, for SEO purposes, but at least they are no longer landing on a 404, and HOPEFULLY your old link juice will pass on.
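For reference, the refresh described above is a single tag in the page's head; this is a minimal sketch, and the URL is a hypothetical placeholder:

```html
<!-- Placed in the <head> of the re-created old page. The URL is hypothetical. -->
<!-- content="0; url=..." means "redirect after 0 seconds" to the given page. -->
<meta http-equiv="refresh" content="0; url=http://www.example.com/new-page.html">
```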
-
While creating the copy of the home-page isn't ideal, if Google hasn't indexed it, it's very likely not creating duplicate content problems. Either they're filtering it out or haven't indexed it at all (since it probably has no links/paths).
I don't think that this alone is the cause of your ranking drop, but you've got a few things going on, so it's tough to say. Unfortunately, most of the ideal solutions seem to be impossible in the Godaddy system, and that's going to continue to cause you some problems.
-
No, I have not made any changes yet. Google has never preferred the /shakeology.html page; I don't think it's ever been indexed. My only problem is that since I tried to CHANGE the root URL, not CREATE ANOTHER VERSION, my SERPs seem to have tanked, and I am trying to avoid the duplicate content issues that I believe the /shakeology.html is causing.
-
It's a bit dangerous to simply block "shakeology.html", if Google has preferred it for a reason - you could end up getting your root page back in the rankings, or you could end up just falling out completely. I think you'd be better off leaving it and having the "wrong" page rank, if that's the only viable option.
I'm actually still showing your root home-page ranking, though, and now the "shakeology.html" page isn't even appearing in the index. Did you already make a change?
-
My original wanted homepage is www.homepage.com
My duplicate is www.homepage.com/shakeology.html
Would it be possible and/or advisable to use a Parameter in Web Master tools to ignore the /shakeology.html?
-
Unfortunately, there just comes a point where sometimes these very narrow CMS systems hit their limits, and it can start to harm you. I don't know Website Tonight well enough to help on that (hopefully someone else does), but there may come a point where you want to consider moving to a more advanced platform. These days, there are a lot of options that aren't budget-breakers, although switching is always a bit tough.
-
Unfortunately no. The edit page section of Website Tonight only shows the newer homepage.com/shakeology.html version and not the original homepage.com
I am afraid to delete the homepage.com/shakeology.html in fear that I will be left with neither one.
You probably aren't seeing the preview of the .com/shakeology.html because it is not the indexed homepage; it shows for the homepage.com version. The canonical tag was my attempt to redirect search engines from the new (unwanted) homepage to the original, because Website Tonight won't allow me to 301 it.
-
Sounds like SEO Executive has got you covered on the Godaddy front - just wanted to point out a couple of things:
(1) I'm not seeing a preview for your home-page, and I had trouble connecting to it the first time. It seems to be cached, so this could be a fluke.
(2) Not sure if this is part of the Godaddy code, but there's a really weird tag on the home-page:
<meta name="canonical tag" content="" />
That might just be a reference, but it doesn't do anything. If it's supposed to actually be a canonical, then something is broken.
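For comparison, a working canonical is a link element, not a meta name/value pair; this is a sketch with a hypothetical URL:

```html
<!-- A real canonical tag is a <link> element in the <head>; the URL is hypothetical. -->
<link rel="canonical" href="http://www.example.com/" />
```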
-
Yes, I saw that, but unfortunately the Organize Your Site page on Website Tonight only shows the new page. I'm afraid to delete it and lose both.
-
I found some great info here that I believe explains it: http://support.godaddy.com/help/2986/organizing-your-website-using-the-organize-site-page
-
I really do appreciate all of your help. Here's the issue I'm having with this, though... After I renamed the homepage file to add /shakeology.html (because I thought it would be beneficial to have a main keyword in the URL), Website Tonight only shows me homepage.com/shakeology.html and not homepage.com. I'm afraid that if I delete /shakeology.html I will be left with neither one and, according to Website Tonight, will in essence just be deleting my homepage. I'm not sure how to accomplish what I'm looking to do without screwing myself any further.
-
Personally, that's what I would do: delete it, unless there is a reason you need that page.
-
Since I can't 301 it, would it be bad to delete the dupe page?
-
Yes, thanks. I foolishly renamed my home page and caused a duplicate page. Website Tonight will not allow me to do a 301 redirect, so I put a canonical tag on /shakeology.html. Should this do the trick?
-
You're welcome! I'm also sending the other side of the story, not to confuse you but to allow you to make a decision based on both sides: http://groups.google.com/a/googleproductforums.com/forum/#!category-topic/webmasters/crawling-indexing--ranking/8nyxCtv9RHM
-
Oh OK, great. Thanks so much for your help. I just got nervous because Google puts up the Severe Health Issue warning every time I get crawled.
-
This is a JavaScript file, and I don't see it being an issue unless Google thinks you're hiding it to be spammy. Also, some say it's a benefit to block JS files from search for SEO purposes. Here is an example of that situation: http://www.seomofo.com/advanced/do-not-let-google-crawl-javascript.html Since this is out of your control and follows the standard way GoDaddy sets up their sites, I think it shouldn't be an issue.
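If you want to check for yourself which URLs a robots.txt blocks, Python's standard library can parse the rules. This is a minimal sketch using a hypothetical excerpt modeled on the Website Tonight defaults; note that Google combines multiple "User-agent: *" groups into one, so the rules are tested here as a single group:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical excerpt modeled on the GoDaddy Website Tonight defaults
# discussed in this thread, merged into a single "User-agent: *" group.
ROBOTS_TXT = """\
User-agent: *
Disallow: /scripts/
Disallow: /images/
Disallow: /_backup/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Regular pages are crawlable (no rule matches, so crawling is allowed)...
print(rp.can_fetch("*", "http://www.example.com/shakeology.html"))      # True
# ...while files under a disallowed folder, like siteUtil.js, are blocked:
print(rp.can_fetch("*", "http://www.example.com/scripts/siteUtil.js"))  # False
```

So a "blocked" report for something like /scripts/siteUtil.js is expected behavior given those Disallow lines, not a sign the home page itself is blocked.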
-
It was just crawled. And it was after robots.txt was uploaded. This is the page it lists: siteUtil.js
-
Also, the following are duplicates: http://www.shakes4life.com/shakeology.html & http://www.shakes4life.com
-
When did Google last index your site? You can check this through Webmaster Tools. When did you install the robots.txt file? The reason I ask: if Google's last crawl was before you uploaded your robots file, then that could be the issue. Please look at these statistics and verify this before we move further.
-
Is Google webmaster tools giving you the specific name of the files that are being blocked?
-
Is there something on it that would be detrimental to my SERPs?
-
Yes. When I type www.shakes4life.com/robots.txt the same list shows.
-
Can you put the following in your browser, replacing "website" with your domain name (with www or non-www in front): website.com/robots.txt
Let me know if you see the same stuff you sent me in your last response.
-
Below is the robots.txt Website Tonight creates when I tell it to allow all pages:
User-agent: *
Allow: /
User-agent: *
Disallow: /cache/
Disallow: /_backup/
Disallow: /_mygallery/
Disallow: /_temp/
Disallow: /_tempalbums/
Disallow: /_tmpfileop/
Disallow: /dbboon/
Disallow: /Flash/
Disallow: /images/
Disallow: /plugins/
Disallow: /scripts/
Disallow: /stats/
Disallow: /statshistory/
Disallow: /WstxSearchResults.html
Disallow: /WstxSearchResults.php
Disallow: /QSC/
-
Yes, you are correct. I forgot to mention (sorry) that I do use S.E.V. It allows you to create a robots.txt and lets you choose pages to block. However, even when you choose to allow all pages, by default it blocks certain files. Go Daddy tells me they are only system files, but Google tells me an important page is blocked.
-
From what I know, Godaddy Website Tonight does not offer you the opportunity to create a custom robots.txt. I believe you have to sign up for their Search Engine Visibility service. Here is some more information: http://support.godaddy.com/help/article/5321