Blocking HTTP 1.0?
-
One of my clients believes someone is trying to hack their site. We are seeing requests with a server protocol of HTTP 1.0, so they want to block HTTP 1.0 entirely.
Will this cause any problems with search engines or regular, non-spamming visitors?
-
I would think that most bots and modern browsers all use HTTP 1.1 by now, but I am sure there are some things out there that still use 1.0. I seem to remember that some phones and old versions of Windows Media Player use 1.0.
I would try to block them another way just to be sure. Maybe rogerbot uses 1.0.
It seems a bit like overkill.
The attackers may just change to 1.1 anyway.
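If the client does go ahead with the block, it is worth checking exactly how the server answers a plain HTTP/1.0 request before and after the change. Below is a minimal Python sketch using a raw socket; example.com is a hypothetical placeholder, and it assumes plain HTTP on port 80.

```python
import socket

def probe_http10(host, path="/"):
    """Send a bare HTTP/1.0 request and return the server's status line,
    e.g. 'HTTP/1.1 200 OK' before a block and 'HTTP/1.1 403 Forbidden' after."""
    with socket.create_connection((host, 80), timeout=10) as sock:
        request = f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"
        sock.sendall(request.encode("ascii"))
        reply = sock.recv(4096).decode("latin-1", errors="replace")
    # The first line of the response is the status line.
    return reply.split("\r\n")[0]

print(probe_http10("example.com"))  # hypothetical host
```

Running it against the live site before the block shows what legitimate 1.0 clients currently get, which is also a quick way to confirm the rule works once it is in place.
-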
Related Questions
-
Huge problem: All our internal links dropped from 9,000 to 0. What happened?
Hi, I just noticed a huge problem with our rankings: they suddenly dropped by more than 50%. Of course, I immediately started to research the issue, and under Links I found that we have somehow lost all of our internal links! They have dropped from 9k to 0. Now, I am sure that we do have some internal links on our site (since I put them there myself). Could you please tell me what is going on and how I can fix this issue? Our site is 1solarsolution.com, and I will also attach screenshots below from Link Explorer. Thank you.
Intermediate & Advanced SEO | alisamana
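As a sanity check while waiting for an answer, it can help to confirm the internal links are actually present in the raw HTML a crawler downloads; if they are only injected by JavaScript, a link index may legitimately report zero. A rough Python sketch, with the URL as a hypothetical placeholder:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import urllib.request

class LinkCollector(HTMLParser):
    """Collect every <a href> value found in the page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def internal_links(url):
    """Return the links in the raw HTML that point at the same host.
    If this finds links but a crawler reports zero, suspect robots rules,
    nofollow, or JavaScript-only navigation instead."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    host = urlparse(url).netloc
    return [h for h in collector.hrefs
            if urlparse(urljoin(url, h)).netloc == host]

print(len(internal_links("https://www.example.com/")))  # hypothetical URL
```
-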
HTTP → HTTPS Migration - Both Websites Live Simultaneously
We have a situation where a vendor, who manages a great deal of our websites, is migrating their platform to HTTPS. The problem is that the HTTP and new HTTPS versions will be live simultaneously (in order to give clients time to audit both sites before the hard switch). I know this isn't the way it should be done, but this is the problem we are facing. My concern was that we would have two websites in the index, so I suggested that they noindex the new HTTPS website until we are ready for the switch. They told me that they would just add canonicals on the HTTPS pages pointing to the HTTP versions, and when it's time for the switch, reverse the canonicals. Is this a viable approach?
Intermediate & Advanced SEO | AMSI-SEO
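If you do end up running both versions with canonicals, a scripted spot check that every HTTPS page really declares its HTTP twin as canonical (and the reverse after the switch) can catch mistakes early. A rough standard-library Python sketch; the URLs and paths are hypothetical, and it assumes the rel attribute is exactly "canonical":

```python
from html.parser import HTMLParser
import urllib.request

class CanonicalFinder(HTMLParser):
    """Pull the href out of <link rel="canonical" href="..."> if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def canonical_of(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

# While both sites are live, each HTTPS page should point at its HTTP twin.
for path in ("/", "/products"):  # hypothetical paths
    url = "https://www.example.com" + path
    print(url, "->", canonical_of(url))
```
-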
How long to re-index a page after being blocked
Morning all! I am doing some research at the moment and am trying to find out, just roughly, how long you have ever had to wait to have a page re-indexed by Google. For this purpose, say you had blocked a page via meta noindex or disallowed access by robots.txt, and then opened it back up. No right or wrong answers, just after a few numbers 🙂 Cheers, -Andy
Intermediate & Advanced SEO | Andy.Drinkwater
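One caveat before counting days: the re-indexing clock only starts once the block is genuinely gone. A quick Python sketch for spotting leftover noindex signals, with a hypothetical URL:

```python
import urllib.request

def noindex_signals(url):
    """Report the two usual noindex signals: an X-Robots-Tag response header
    and a robots meta tag in the HTML. Both should be gone before waiting
    for Google to re-index the page."""
    req = urllib.request.Request(url, headers={"User-Agent": "reindex-check"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        header = resp.headers.get("X-Robots-Tag", "")
        html = resp.read().decode("utf-8", errors="replace").lower()
    # Crude string check; a real audit should parse the meta tags properly.
    meta_noindex = 'name="robots"' in html and "noindex" in html
    return {"x_robots_tag": header, "meta_noindex_suspected": meta_noindex}

print(noindex_signals("https://www.example.com/unblocked-page"))  # hypothetical
```
-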
Panda 4.0 Update Affected Site - What should be the minimum code-to-text ratio we should aim for?
Hi All, My eCommerce site got hit badly by the Panda 4.0 update, so we have been doing some site auditing and analysis, identifying issues which need addressing. We have thin/duplicate content issues, which I am quite sure was part of the reason we were affected, even though we use rel=next and rel=prev along with having a separate view-all page (although we don't canonical tag to this page, as I don't think users would benefit from seeing too many items on one page). This led me to look at our code-to-text ratio. We have now managed to increase it from 9% to approx. 18-22% on popular pages by getting rid of unnecessary code etc. My question is, is there an ideal percentage the code-to-text ratio should be, and what should I be aiming for? Also, any other Panda 4.0 advice would be appreciated. Thanks, Sarah
Intermediate & Advanced SEO | SarahCollins
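There is no agreed-upon target figure, but if you want to track the ratio yourself rather than rely on a tool's definition, something like this rough Python sketch (visible text as a share of total HTML, hypothetical URL) works for spot checks:

```python
from html.parser import HTMLParser
import urllib.request

class VisibleText(HTMLParser):
    """Accumulate text that falls outside <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def text_to_code_ratio(url):
    """Visible text characters as a percentage of the full HTML payload."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = VisibleText()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return 100.0 * len(text) / len(html)

print(f"{text_to_code_ratio('https://www.example.com/'):.1f}%")  # hypothetical URL
```

Tracking the same pages with the same script over time matters more than the absolute number, since every tool counts "text" slightly differently.
-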
Consistent Ranking Jumps Page 1 to Page 5 for months - help needed
Hi guys and gals, I have a really tricky client whose SERP results I just can't seem to get consistent. The keywords are competitive, but the main issue is the big page jumps that happen pretty much on a weekly basis. We go up and down 40 positions, and this behaviour has been going on for nearly 6 months. I felt it would resolve itself in time, but it has not. The website is a large ecommerce site. Their link profile is OK: several high-quality newspaper publication links, majority brand-related anchor texts, and the link building we have engaged in has all been very good, i.e. content-relevant, high-quality places. See below for some potential causes I think could be the reason:
1. The on-page SEO is good; however, the way their ecommerce site is set up, they have formed a substantial amount of duplicate title tags. So in my opinion this is a potential cause.
2. The previous web developer set up 301 redirects to the home page for all 404 errors. I know best practice is to redirect to the most relevant pages; however, could this be a potential issue?
3. We had some server connectivity issues show up in Webmaster Tools, but that was for 1 day about 4 months ago. Since then, no issues.
4. They have quite a few blocked URLs in their robots.txt file, e.g. Disallow: /login and Disallow: /checkout/, but to me these seem normal and not a big issue.
5. We have seen 'total indexed web pages' in Webmaster Tools decrease from 5000 to 2000 over the last 12 months, which is quite an odd statistic.
Summary: So all in all I am a tad stumped. We have some duplicate content issues in title tags, and we are perhaps not following best practice with the 301 redirects, but other than that I don't see any major on-page issues, unless I am missing something in the seriousness of what I have listed. Finally, we have also done a bit of a cull of poor-quality links, requesting links to be removed and also submitting a disavow of some really bad links. We do not have a manual penalty, though. Thoughts, feedback or comments VERY welcome.
Intermediate & Advanced SEO | Jon_bangonline
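Since duplicate title tags top the list of suspects, one quick way to size that problem is to pull titles for a sample of URLs and count repeats. A rough Python sketch; the sample URLs are hypothetical placeholders:

```python
import re
import urllib.request
from collections import Counter

def page_title(url):
    """Extract the <title> text with a simple regex; fine for a spot check,
    not a substitute for a proper crawl."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    match = re.search(r"<title[^>]*>(.*?)</title>", html,
                      re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else ""

urls = [  # hypothetical sample; in practice, feed in a sitemap or crawl export
    "https://www.example.com/widgets",
    "https://www.example.com/widgets?sort=price",
]
duplicates = Counter(page_title(u) for u in urls)
for title, count in duplicates.items():
    if count > 1:
        print(f"{count} URLs share the title {title!r}")
```
-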
What should I block with a robots.txt file?
Hi Mozzers, We're having a hard time getting our site indexed, and I have a feeling my dev team may be blocking too much of our site via our robots.txt file. They say they have disallowed PHP and Smarty files. Is there any harm in allowing these pages? Thanks!
Intermediate & Advanced SEO | Travis-W
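Before changing anything, you can test what the current rules actually block using Python's standard-library robotparser. Note that it only does simple prefix matching, so wildcard rules that Google supports are not evaluated the same way. The rules below are hypothetical stand-ins for whatever the dev team wrote:

```python
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /includes/
Disallow: /templates/
"""  # hypothetical stand-in for the site's real robots.txt

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Check a blocked template path and a normal product page.
for path in ("/templates/product.tpl", "/products/blue-widget"):
    print(path, "allowed:", rp.can_fetch("Googlebot", path))
```

If pages you want indexed come back as disallowed, that is a strong clue the robots.txt is too aggressive.
-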
HTTP Errors in Webmaster Tools
We recently added a 301 redirect from our non-www domain to the www version. As a result, we now have tons of HTTP errors (403s to be exact) in Webmaster Tools. They're all from over a month ago, but they still show up. How can we fix this?
Intermediate & Advanced SEO | kylesuss
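One way to narrow this down is to look at the raw status of the first response hop without following redirects: if the non-www host answers 403 instead of 301 for some requests, that is where the reports come from. A small standard-library Python sketch, with hypothetical hostnames:

```python
import http.client

def first_hop(host, path="/", use_https=False):
    """Return the status code and Location header of the first response,
    without following redirects. A clean non-www -> www setup should
    answer 301 with the www URL, not 403."""
    conn_cls = http.client.HTTPSConnection if use_https else http.client.HTTPConnection
    conn = conn_cls(host, timeout=10)
    conn.request("GET", path)
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

print(first_hop("example.com"))      # hypothetical non-www host
print(first_hop("www.example.com"))  # and the canonical www host
```
-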
Block Google Sitelinks for DSEO?
I am trying to manage DSEO for a client. The question is: would blocking a page listing from my client's Google sitelinks cause that blocked sitelink page to be independently listed in the rankings, and therefore potentially push a negative listing further down the rankings? Or would the blocked sitelink not show up at all in the SERPs?
Intermediate & Advanced SEO | bcmull