Severe health issue on my site through Webmaster Tools
-
I use Go Daddy Website Tonight. I keep getting a severe health message in Google Webmaster Tools stating that my robots.txt file is blocking an important page. When I try to get more details, the blocked file will not open. When I asked the Go Daddy peeps, they told me it was just image and backup files that do not need to be crawled. But if Google's spiders keep thinking an important page is blocked, will this hurt my SERPs?
-
I would just like to add: if you're considering signing up for something like Search Engine Visibility (SEV), you may as well get a real hosting package.
-
Thanks for letting us know, and glad you found a work-around. A 0-second META REFRESH sometimes acts like a 301 - it's not ideal, as you said, but it's something.
-
For anyone else with Website Tonight, I have finally found a work-around, if not a fix. Since Website Tonight will not allow you to do a 301 redirect of an old page, I have figured out that if you re-create the deleted page (just the URL, not the content) and use a meta tag to do a REFRESH to the new page, any time the old page is clicked it will bring visitors to the new page. Not ideal, of course, for SEO purposes, but at least they are no longer landing on a 404, and HOPEFULLY your old link juice will pass on.
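For reference, the meta refresh described above is a single tag placed in the head of the re-created old page; the target URL below is a placeholder for your own new page:

```html
<!-- 0-second meta refresh: the "0" means redirect immediately.
     The URL is a placeholder for your new page. -->
<meta http-equiv="refresh" content="0; url=http://www.example.com/new-page.html" />
```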
-
While creating the copy of the home-page isn't ideal, if Google hasn't indexed it, it's very likely not creating duplicate content problems. Either they're filtering it out or haven't indexed it at all (since it probably has no links/paths).
I don't think that this alone is the cause of your ranking drop, but you've got a few things going on, so it's tough to say. Unfortunately, most of the ideal solutions seem to be impossible in the Godaddy system, and that's going to continue to cause you some problems.
-
No, I have not made any changes yet. Google has never preferred the /shakeology.html page; I don't think it's ever been indexed. My only problem is that since I tried to CHANGE the root URL, not CREATE ANOTHER VERSION, my SERPs seem to have tanked, and I am trying to avoid the duplicate content issues that I believe /shakeology.html is causing.
-
It's a bit dangerous to simply block "shakeology.html", if Google has preferred it for a reason - you could end up getting your root page back in the rankings, or you could end up just falling out completely. I think you'd be better off leaving it and having the "wrong" page rank, if that's the only viable option.
I'm actually still showing your root home-page ranking, though, and now the "shakeology.html" page isn't even appearing in the index. Did you already make a change?
-
My original (wanted) homepage is www.homepage.com
My duplicate is www.homepage.com/shakeology.html
Would it be possible and/or advisable to use a Parameter in Web Master tools to ignore the /shakeology.html?
-
Unfortunately, there just comes a point where these very narrow CMS systems hit their limits, and it can start to harm you. I don't know Website Tonight well enough to help on that (hopefully someone else does), but there may come a point where you want to consider moving to a more advanced platform. These days, there are a lot of options that aren't budget-breakers, although switching is always a bit tough.
-
Unfortunately, no. The edit page section of Website Tonight only shows the newer homepage.com/shakeology.html version and not the original homepage.com.
I am afraid to delete homepage.com/shakeology.html for fear that I will be left with neither one.
You probably aren't seeing the preview of the /shakeology.html version because it is not the indexed homepage; the preview shows for the homepage.com version. The canonical tag was me trying to redirect search engines from the new (unwanted) homepage to the original, because Website Tonight won't allow me to 301 it.
-
Sounds like SEO Executive has got you covered on the Godaddy front - just wanted to point out a couple of things:
(1) I'm not seeing a preview for your home-page, and I had trouble connecting to it the first time. It seems to be cached, so this could be a fluke.
(2) Not sure if this is part of the Godaddy code, but there's a really weird tag on the home-page:
<meta name="canonical tag" content="" />
That might just be a leftover placeholder, but it doesn't do anything. If it's supposed to be an actual canonical tag, then something is broken.
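For comparison, a working canonical element uses a link tag with a rel="canonical" attribute, not a name attribute; the URL below is a placeholder:

```html
<!-- A valid canonical element, placed in the page's <head> -->
<link rel="canonical" href="http://www.example.com/" />
```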
-
Yes, I saw that, but unfortunately the Organize Your Site page on Website Tonight only shows the new page. I'm afraid to delete it and lose both.
-
I found some great info here that I believe explains it: http://support.godaddy.com/help/2986/organizing-your-website-using-the-organize-site-page
-
I really do appreciate all of your help. Here's the issue I'm having with this, though... After I renamed the homepage file to add /shakeology.html to it (because I thought it would be beneficial to have a main keyword in the URL), Website Tonight only shows me homepage.com/shakeology.html and not homepage.com. I'm afraid that if I delete /shakeology.html I will be left with neither one and, according to Website Tonight, will in essence just be deleting my homepage. I'm not sure how to properly accomplish what I'm looking to do without screwing myself any further.
-
Personally, that's what I would do: delete it, unless there is a reason you need that page.
-
Since I can't 301 it, would it be bad to delete the dupe page?
-
Yes, thanks. I foolishly renamed my home page and caused a duplicate page. Website Tonight will not allow me to do a 301 redirect. I put a canonical tag on the /shakeology page. Should this do the trick?
-
You're welcome! I'm also sending the other side of the story, not to confuse you but to allow you to make a decision based on both sides: http://groups.google.com/a/googleproductforums.com/forum/#!category-topic/webmasters/crawling-indexing--ranking/8nyxCtv9RHM
-
Oh OK, great. Thanks so much for your help. I just got nervous because Google puts up the Severe Health Issue warning every time my site gets crawled.
-
This is a JavaScript file, and I don't see it being an issue unless Google thinks you're hiding it to be spammy. Also, some say it's actually a benefit for SEO to block JS files from search engines. Here is an example of that argument: http://www.seomofo.com/advanced/do-not-let-google-crawl-javascript.html Since this is out of your control and follows the standard way GoDaddy sets up their sites, I think it shouldn't be an issue.
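As an aside, if the robots.txt file were editable (which Website Tonight apparently doesn't allow), Google's robots.txt handling lets a longer, more specific Allow rule override a shorter Disallow for the same crawler, so a single file could be unblocked from an otherwise blocked directory. A sketch, assuming siteUtil.js lives under /scripts/:

```text
User-agent: Googlebot
Disallow: /scripts/
Allow: /scripts/siteUtil.js
```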
-
It was just crawled. And it was after robots.txt was uploaded. This is the page it lists: siteUtil.js
-
Also, the following are duplicates: http://www.shakes4life.com/shakeology.html & http://www.shakes4life.com
-
When did Google last index your site? You can check this through Webmaster Tools. When did you install the robots.txt file? The reason I ask: if Google's last crawl was before you uploaded your robots file, then that could be the issue. Please look at these statistics and verify this before we move further.
-
Is Google webmaster tools giving you the specific name of the files that are being blocked?
-
Is there something on it that would be detrimental to my SERPS?
-
Yes. When I type www.shakes4life.com/robots.txt the same list shows.
-
Can you place the following in your browser, replacing "website" with your domain name (with www or non-www in front): website.com/robots.txt
Let me know if you see the same stuff you sent me in your last response.
-
Below is the robots.txt file Website Tonight creates when I tell it to allow all pages:
User-agent: *
Allow: /
User-agent: *
Disallow: /cache/
Disallow: /_backup/
Disallow: /_mygallery/
Disallow: /_temp/
Disallow: /_tempalbums/
Disallow: /_tmpfileop/
Disallow: /dbboon/
Disallow: /Flash/
Disallow: /images/
Disallow: /plugins/
Disallow: /scripts/
Disallow: /stats/
Disallow: /statshistory/
Disallow: /WstxSearchResults.html
Disallow: /WstxSearchResults.php
Disallow: /QSC/
-
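If you want to check what a file like the one quoted above actually blocks, Python's standard-library robots.txt parser can help. A quick sketch: only the Disallow group is fed in, because parsers differ in how they merge the duplicate "User-agent: *" groups in this file (Google merges them; Python's parser does not), and the assumption that siteUtil.js lives under /scripts/ is mine:

```python
# Check which paths the Disallow group blocks, using Python's stdlib parser.
# Only a subset of the GoDaddy Disallow rules is included for brevity.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /cache/
Disallow: /_backup/
Disallow: /images/
Disallow: /scripts/
Disallow: /stats/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# siteUtil.js presumably lives under /scripts/ (an assumption), so it is blocked:
print(rp.can_fetch("Googlebot", "http://www.shakes4life.com/scripts/siteUtil.js"))  # False
print(rp.can_fetch("Googlebot", "http://www.shakes4life.com/"))                     # True
```

This at least lets you confirm that the home page itself is not being blocked, only the system directories.
-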
Yes you are correct. I forgot to mention (sorry) that I do use S.E.V. It allows you to create a robots.txt and lets you choose pages to block. However, even when you choose allow all, by default it blocks certain files. Go Daddy tells me they are only system files but Google tells me an important page is blocked.
-
From what I know, Godaddy Website Tonight does not offer you the opportunity to create a custom robots.txt. I believe you have to sign up for their Search Engine Visibility service. Here is some more information: http://support.godaddy.com/help/article/5321