If Google doesn’t know we’re hosted in the UK, does that affect our SERPs?
-
Hi,
In November 2011 our eCommerce website dropped from between 3rd and 4th position in the UK SERPs down to 7th and 8th. A year on, we still haven't recovered the original ranking despite our best efforts, and we're looking for some insight into what could have happened. One of our theories is below; do you think it might be the problem?
In October 2011 we moved from a single-site custom-built CMS hosted in the UK to a multi-site custom-built CMS hosted on a much better server, also based in the UK. As part of this move we started using CloudFlare (a security-focused CDN) to help with security and performance. Because CloudFlare's servers are in the US, to the outside world it almost looks like we went from a slow hosting company in the UK to a much quicker hosting company in the US.
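One cheap way to see which CloudFlare edge actually answers a given request is the CF-RAY response header CloudFlare adds, whose suffix is the IATA-style code of the serving data centre (LHR for London, for example). A minimal sketch of pulling that code out of a header value; the ray ID below is made up for illustration:

```python
def edge_colo(cf_ray: str) -> str:
    """Return the data-centre code from a CF-RAY header value,
    e.g. '6d4c0de8faa13b66-LHR' -> 'LHR'."""
    return cf_ray.rsplit("-", 1)[-1]

# A UK visitor being served from a London edge would see something like:
print(edge_colo("6d4c0de8faa13b66-LHR"))  # LHR
```

So even if a reverse-IP lookup shows a US-registered address, the request may well be terminated at a UK edge.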
Could this have affected our rankings? We know that Google takes the server IP address into account as a ranking factor, but as far as we understand it's because they (rightly) believe that a server closer to the user will perform better. So a UK server will serve up pages quicker to a visitor in the UK than a US server will, because the data has a shorter distance to travel.
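The distance argument can be put in rough numbers. A back-of-the-envelope sketch of the physical lower bound on round-trip time, using an assumed signal speed in fibre of about two-thirds of c and rough great-circle distances (both are assumptions, not measurements):

```python
C_FIBRE_KM_S = 200_000  # approx. speed of light in glass fibre, km/s

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds over a given distance."""
    return 2 * distance_km / C_FIBRE_KM_S * 1000

print(round(min_rtt_ms(300), 1))   # London to a UK host, ~300 km  -> 3.0
print(round(min_rtt_ms(6000), 1))  # London to US east coast, ~6,000 km -> 60.0
```

Even in the best case, a transatlantic hop adds tens of milliseconds per round trip, which is exactly why a CDN keeps static content at a nearby edge.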
However, we’re definitely not experiencing an issue with being recognised as a UK website. We have a .co.uk domain (which is obviously a big indicator) and if you click on “Pages from the UK” in the SERPs we jump up to 3rd place. So Google seems to know we’re a UK site.
Is the fact we’re using CloudFlare and hence hiding our real server IP address – is this penalising us in the SERPs?
Currently, of the 6 websites above us, 4 are in the US and 2 are in the UK. All of these are massive sites with lots of links, so smaller ranking factors might matter more for us. Obviously the big downside of not using CloudFlare is that our site becomes much less secure and much slower. Images and some static content are distributed via a local CloudFlare server, which should tick Google's box in terms of providing a quick site for users.
CloudFlare say in a blog post that they had Google crawl-rate and geo-tagging issues in the past when they were just starting out, but in 2010 they began working with "the big search engines" to make sure CloudFlare was treated like a CDN (so the special rules that apply to Akamai also apply to CloudFlare). Since they've been working with Google, CloudFlare say their customers should only see a positive SEO impact.
So at the moment we’re at a loss about what happened to our ranking. Google say they take IP’s into account for ranking, but by using CloudFlare it looks like we’re in the US. We definitely know we’re not having geo-tagging issues and CloudFlare say they’re working with Google to ensure its customers aren't seeing a negative impact by using CloudFlare, but a niggling part of us still wonders whether it could impact our SEO.
Many thanks, James
-
Hi Des,
Thanks, I didn't know that's why we were assigned a "special crawl rate" in WMT. Could the special crawl rate affect our ranking? For example, Google puts a lot of weight on freshness, so if Google is crawling us less (we can't tell if it's more or less than before), could this make our site look less fresh?
We have really tried our best to rule out all other possibilities. Our content is much better and more frequent than it was before, and our link building is natural and gradual. We've also looked at over-optimisation and our competitors. Our competitors are Wikipedia, a couple of national UK newspapers, Harvard, a medical encyclopaedia and a single American competitor. We're the first UK company to appear in the SERPs. Whilst these are obviously very big companies, none of them (with the exception of the American company) targets the keyword as heavily as our website does.
Incidentally, we did come back up to 4th yesterday, but we've already dropped a place today, so it doesn't look like it'll last. The other thing we found really strange is that the singular version of our keyword didn't drop at all and has stayed very stable; it's only the plural keyword that dropped. The vast majority of our anchor text uses the plural version (it's in our brand name) and the domain also contains the plural version. Was there an algorithm change around that time, or are we perhaps over-optimising the plural keyword? (Is that even possible?)
Thanks, James
-
Hi SEO5,
Thanks for your response. I had come across that forum post before; incidentally, it led me to CloudFlare's article about how they work with Google. Maybe I wasn't being explicit enough in my question. We definitely know that Google prefers a faster website and that it has special rules for CDNs. So just to clarify, we were wondering whether there is any way that using CloudFlare (and therefore not making it clear we're hosted in the UK) could negatively affect our rankings? We're specifically looking at the UK SERPs, rather than the US SERPs.
Also, could you clarify what you mean by the server change? The new server is faster and more reliable, but are there factors other than server speed and server location that Google takes into account?
We've also looked at over-optimisation and our competitors. Our competitors are Wikipedia, a couple of national UK newspapers, Harvard, a medical encyclopaedia and a single American competitor. We're the first UK company to appear in the SERPs. Whilst these are obviously very big companies, none of them (with the exception of the American company) targets the keyword as much as our website does.
Thanks, James
-
1. CloudFlare is a distributed service with edge servers in many countries around the world. These servers send your website content from the edge, so users in the UK will most likely be served from a UK server.
2. Google is well clued up on CloudFlare and, as you said, will know that it is dealing with a CDN. Hence in Webmaster Tools you will not be able to adjust the Google crawl rate.
3. In Google Webmaster Tools you can set your desired geographic target for the TLD you are targeting, in your case .co.uk. This tells Google you are interested in UK-based queries.
4. You should not rule out other possibilities. As Sherlock says, "when you have eliminated the impossible, whatever remains, however improbable, must be the truth."
-
Hi James,
The loss of ranking could be attributed to various factors:
- An over-optimization penalty
- The server change
- Competitors outranking you with higher-quality, more aggressive SEO
If you do have a .co.uk domain, then the apparent change of hosting from the UK to the US could have impacted the rankings.
Here's a link to a discussion on CloudFlare's impact on SEO.
Related Questions
-
Google inaccurate results: Common or error?
Hi all, While searching for our primary keyword, I can see 2 websites in the second-page results which are unrelated to the keyword or industry, but their company name is this keyword. For example, if I wanted to rank for "SEO", there are 2 websites called "seo trucks" and "seo paints". I wonder how Google is ranking these websites for a high-competition keyword with 1 million searches per month. Is the keyword in the URL, and the keyword mentioned across the website as their brand name, taking over other potential ranking factors like backlinks, relevant content, user clicks, etc.? Thanks
-
Where has Google found the £1.00 value for the penny black? Is it Google moving beyond the mark-ups too?
Hi guys, I am curious, so am wondering something about the Penny Black SERPs.
Apparently Google shows a value of £1.00 in the Penny Black SERP. Where does it come from? It's not the marked-up value. The Wikipedia page doesn't have any mark-up for it; it actually has a price mark-up of 1 penny. Among the rare stamps, the Inverted Jenny also shows a value in its SERP, but that is clearly taken from USPS and is the cost of a new reissue of this rare stamp; indeed, the mark-up matches that value. I've been looking online for a new reissue of the Penny Black, but couldn't find anything.
The only small piece of information I've found linking one pound with the Penny Black is on the Wikipedia page, but the point is: is Google able to extract that information from this passage? It's not a mark-up, it's not a lone number, and it's not a simple sentence like "The Penny Black cost £1.00"; it reads "One full sheet cost 240 pennies or one pound sterling". Is Google moving beyond mark-ups too? Thanks, Pierpaolo
-
Anyone Notice Google's Latest Change Seems to Favor Google Books?
I've noticed a change in the search results lately. As I search around, I notice a lot of results from books.google.com. Seems a little (OK, a lot) self-serving... JMHO
-
Removing an old Google places listing for a newer version?
Hey there, I was wondering whether you could help me out with the following: one of our clients has a Google Places listing that we created for their business, but it appears to be blocked by, or at least conflicting with, an old listing. As such, Google appears to be showing the old listing with an outdated URL and company name rather than the new one. Does anyone know how I can go about removing this old listing or signalling that the newer one is now more relevant? Unfortunately, I don't have the logins for the old Places listing. Old listing: https://plus.google.com/105224923085379238289 New listing: https://plus.google.com/b/114641937407677713536/114641937407677713536
-
SERP Drop
Hi, I have been trading online since 2006 and over the years I have built up some impressive SERPs for keywords such as "mens underwear", for which I held position 1. However, over the past 6 months I have pretty much dropped off the face of Google for a large proportion of my keywords. I suspect I have been hit by the Panda/Penguin updates and do not know how to recover. I have a mixture of what I consider to be relevant and healthy links, but there are also a few links in there that Google would no longer like. However, I believe that the majority of my links are OK. What should I do? Thanks
-
How To Rank High In Google Places?
Hello SEOmoz, This question has been hounding me for a long time and I've never seen a single piece of reliable information on the web that answers it. Anyway, here's my question: supposing there are three Google Places listings for three different websites with the same categories, almost the same keywords and the same district/city/IP, how does Google rank one higher than the others? Or, simply put, if you owned one of those websites and wanted to rank higher than your competitors in Google Places search results, how would you do it? A number of theories were brought up by some of my colleagues:
1. The age of the listing
2. The number of links pointing to the listing (supposing that one can build links to one's listing)
3. The name/URL of the listing, tags, description, etc.
4. The address of the listing
5. Authority of the domain (linked website)
You see, some listings have no description and only one category, yet they rank number one for a specific term/keyword, whereas others have complete categories, descriptions etc. If you could give me a definite answer I would surely appreciate it. Thank you very much and more power!
-
Removing secure subdomain from google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and then duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific robots.txt file for the secure subdomain that disallows everything:
User-agent: *
Disallow: /
However, these duplicated secure pages remain in the index. My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages being indexed, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you,
-
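A quick way to sanity-check what a given robots.txt actually blocks is Python's standard-library parser. A sketch with hypothetical URLs; note that urllib.robotparser does plain prefix matching and does not understand Googlebot-style * wildcards, so the /customer/* rule is written as a plain /customer/ prefix here:

```python
from urllib import robotparser

# The main site's rules, with the wildcard rewritten as a plain prefix,
# since the stdlib parser matches path prefixes literally.
rules = """\
User-agent: *
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Secure pages should be blocked for all crawlers...
print(rp.can_fetch("*", "https://www.example.com/login.cgis"))    # False
print(rp.can_fetch("*", "https://www.example.com/customer/123"))  # False
# ...while normal catalogue pages stay crawlable.
print(rp.can_fetch("*", "https://www.example.com/products/hat"))  # True
```

Keep in mind that robots.txt only stops crawling; URLs that are already indexed can linger until they are removed or marked noindex, which is consistent with the behaviour described above.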
Google +1 link on Domain or Page?
Since its release, I've seen the Google +1 button used across an entire domain while only referencing the root href in the code snippet. At the same time, you see other sites use +1 more naturally, with the button being specific to the page you're on. What's your take on this? To clarify, do you add: or .. on each page.