Website Ranking Differently Based on IP/Data Center
-
I have a site which I thought was ranking well; however, that doesn't seem to be the case.
When I check the site from some IPs within the US, it shows on page 1; from other IPs it shows on page 5, and for some keywords it isn't listed at all.
This site was ranking well before, but I think Google dropped it when I was putting too much work into it (articles and press releases). Now it seems to have recovered when I check from my IP, but other data centers still show the pre-recovery results.
It was able to recover after I stopped building links to it for a period of time; it moved back up on the data center I'm connected to, but other data centers still show the possibly penalized results.
Is it possible that the site is still penalized?
So the question is: why does it show as recovered on some data centers and not others?
How do I fix this?
It's been about two months since it recovered on some data centers. Is this site still penalized, or what's going on?
There are no warnings in Webmaster Tools.
Any insights would be appreciated!
This isn't an issue with the rank-tracking software; I've tested this from a multitude of IPs in different locations.
Thanks!
-
Linkdex and Authority Labs now offer the opportunity to see different geo-rankings.
That said, no one seems to have answered why a site ranks differently in every location.
The answer is quite simple: personalization and geotargeting of users. Google tends to present users with sites it considers more useful from a geographical perspective as well.
This is blatantly evident in the case of hotels or local businesses, but it is also applied to sites which, at first, may not seem to have any real dependency on geotargeting. Why? Let's take SEO companies. A neutral search for "SEO Services" will show you one set of results, but if I do a non-neutral search, Google starts showing me, living in Spain, SEO sites from the UK, which are perhaps more relevant to me as they are geographically closer and in roughly the same timezone.
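If you ever want to approximate that "neutral search" yourself, here is a rough Python sketch. It leans on assumptions: `pws=0`, `gl`, and `hl` are long-standing but undocumented Google query parameters, and results can still vary with the IP you query from.

```python
from urllib.parse import urlencode

def neutral_search_url(query, country_code="us", language="en"):
    """Build a Google search URL that reduces personalization.

    pws=0 disables personalized results (an unofficial parameter),
    while gl/hl hint the country and interface language. None of
    this is guaranteed by Google; treat it as a rough control.
    """
    params = {
        "q": query,
        "pws": "0",          # turn off personalized web search
        "gl": country_code,  # country hint for geotargeting
        "hl": language,      # interface language
    }
    return "https://www.google.com/search?" + urlencode(params)

# Compare the UK-targeted results against the US-targeted ones.
print(neutral_search_url("SEO Services", country_code="uk"))
print(neutral_search_url("SEO Services", country_code="us"))
```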
-
Hi A A,
I generally use GeoRanker.com to get this information. It's got a fairly generous full-featured free version, and lets you target countries and cities and compare the first page results. It's quite useful for comparing differing results between cities.
If you're looking for multiple-keyword tracking in the top 100, your best bet is to use something like Rank Tracker or AWR with proxies located in the same country as the ccTLD you're searching (i.e. use UK proxies for google.co.uk). You can automate this to an extent in Rank Tracker (in Preferences > Proxy Rotation).
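If you end up scripting this yourself, the core of it is just "fetch a results page through a country-matched proxy, then find your domain's position". A minimal Python sketch follows; the proxy hostnames are hypothetical placeholders, and note that scraping Google directly may violate its terms of service, which is part of why the tools above exist.

```python
import urllib.parse
import urllib.request

# Hypothetical proxy endpoints: substitute real proxies located in
# the same country as the ccTLD you are checking.
COUNTRY_PROXIES = {
    "google.co.uk": "http://uk-proxy.example.com:8080",
    "google.com": "http://us-proxy.example.com:8080",
}

def rank_of(domain, result_urls):
    """Return the 1-based position of `domain` in an ordered list of
    result URLs, or None if the domain is not listed."""
    for position, url in enumerate(result_urls, start=1):
        if domain in urllib.parse.urlparse(url).netloc:
            return position
    return None

def fetch_serp_html(query, google_host):
    """Fetch one results page through the matching proxy. Parsing the
    HTML into an ordered list of result URLs is left out here."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": COUNTRY_PROXIES[google_host]})
    )
    url = "https://%s/search?%s" % (
        google_host,
        urllib.parse.urlencode({"q": query}),
    )
    return opener.open(url, timeout=10).read().decode("utf-8")
```

Comparing `rank_of(your_domain, parsed_results)` across country-specific proxies is essentially what the original poster is doing by hand.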
Hope that helps!
-
Most of my clients are in different cities, but I've never seen the rankings vary as much as this.
Just wondering if there is an answer to this, or if I should suggest starting over. If the site is penalized, the client won't be happy, for sure.
Also wondering whether there is a way to tell if the site is penalized, though it makes no sense for it to be penalized on some data centers but not others.
It's also very frustrating dealing with clients all the time, so you may have the ideal position working in-house.
-
Thanks AA
Yes, this explains a lot. I am an in-house SEO in the USA. I don't have a need (yet) for this kind of thing. Quite frankly, if I got to the point where I really needed this, I might consider handing off the client to someone in a more relevant locale, or outsourcing to someone local to them.
I think the ability to check via tools like HMA has a limited horizon, and that one can't build a business that relies on the ability to check rankings in this manner. Just my two cents, and I'm certainly not saying I am an authority on this particular subject.
The topic has been fascinating to me!
-
Hi Dana and JDP,
I use Hide My Ass (aff link removed by staff), which has a pretty good service for checking from different locations.
I needed this because I have clients in different locations, so I had to find proxies within their areas. I found that the results they see are drastically different from my own, and also from other IP addresses in this situation. I have other clients whose results vary, but not by the same margin.
Any help or ideas of what I should do would be appreciated. Thank you!
-
Hello A A,
Your post got me paranoid, and I went around to all the different online tools that check rankings from different data centers, like http://dccheck.com/. I got a wide variety of responses from these tools, and it made me wonder if they worked at all. I don't have any hard answers for you either, and it sounds like you may have had some trouble in the past; however, it is possible you are freaking out about nothing. Is your traffic stable? You could check Google Analytics and filter for search traffic, then your keywords. From there, check the geographic map the traffic is coming from. Compare a current time period this way to a past period when you know you were ranking well. If it is not too different, maybe there is nothing to worry about. Does that help?
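To make that comparison concrete, here is a rough Python sketch. It assumes you have exported the Analytics geographic report for two equal-length periods as {region: organic visits} mappings; the numbers in the example are made up.

```python
def geo_shift(past, current, threshold=0.5):
    """Compare two {region: organic_visits} mappings and return the
    regions whose share of total traffic dropped by more than
    `threshold` (50% by default), as (past_share, current_share)."""
    past_total = sum(past.values()) or 1
    curr_total = sum(current.values()) or 1
    flagged = {}
    for region, visits in past.items():
        past_share = visits / past_total
        curr_share = current.get(region, 0) / curr_total
        if past_share > 0 and (past_share - curr_share) / past_share > threshold:
            flagged[region] = (round(past_share, 3), round(curr_share, 3))
    return flagged

# Made-up example: Texas traffic collapsed while California held
# steady, which would be consistent with region-specific drops.
print(geo_shift({"California": 500, "Texas": 400},
                {"California": 520, "Texas": 60}))
```

If nothing gets flagged between a known-good period and now, the data-center differences you are seeing may not be costing you much real traffic.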
-
Hi A A, I'm sorry I am not posting with a really good answer to your question, but instead I have questions. How are you checking your rankings from multiple data centers? I have been doing SEO professionally for a long time (8 years), but there's certainly a lot I don't know.
How does one check one's rankings from a multitude of data centers? Is this really necessary in the US?
I understand that different regions might show different results even to Google users who aren't logged in to Google, based on their geographic location. But, I haven't ever heard of drastic changes in rankings from one data center to another unless it was for a location-specific Website.
Sorry to answer your question with a question, but I am just really curious about this. Thanks!
Dana
P.S. I am currently using SEOmoz Pro, WebPosition.com, Rank Tracker (sporadically), and AWR, but am not aware that you can set any of these up to search from different data centers?
Related Questions
-
Google suddenly stops ranking a page for a "keyword" with same "keyword" in title tag. Low competition.
Hi all, we have released the next version of our product, called something like "software 11", which gets thousands of searches every month. So we added this same keyword, "software 11", as a page-title suffix on one of our top-ranking pages. So this is the page that suddenly had "software 11" added to the page title, to multiple header tags, and in one mention in a paragraph. Google ranked it for 2 days and then suddenly stopped showing this page anywhere in the results for the very keyword we optimised it for. Why did this happen? Does Google think that we are overdoing it with this page and is ignoring it? Thanks
White Hat / Black Hat SEO | vtmoz
-
New Service/Product SEO and rankings
Hello, fellow MOZers. We are a web design company, and we had SEO as a secondary service for years. Due to changes in the company, we started pushing SEO as one of our main services about 6 months ago. We have a separate page targeting that service, as well as case studies, supporting information pages, and even an SEO Center, which is like a blog about SEO only. We are not using black-hat SEO: we do honest link earning and building, don't use keyword stuffing; everything is by the book. I understand that SEO takes time, especially for a company whose footprint is that of a web design company, not an SEO company. We are ranking very well for web-design-related keyphrases; however, we don't see any improvements for SEO-related keywords. It always was, and still is, between positions 25-30 in the SERPs. At the same time, the competitors ranking on the first page for SEO-related phrases look pretty bad, design-wise as well as blackhat-SEO-wise. Everything is keyword-stuffed, UX is horrible, prices are ridiculous. So, do you guys have any thoughts/advice on how we can see results, or why we are not seeing results? Links: Google search result: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=seo%20houston Competitors: www.seohouston.com, www.graphicsbycindy.com Our pages: https://www.hyperlinksmedia.com/seo-houston.php, https://www.hyperlinksmedia.com/seo-houston/
White Hat / Black Hat SEO | seomozinator
-
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The portion of traffic throttled for each bot can be calculated dynamically at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will add indexing latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
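For what it's worth, the load-aware throttling rule described above can be sketched like this in Python. The thresholds, bot list, and rejection heuristic are illustrative assumptions, not a tested policy; also note that Google has reportedly treated long-lasting 503s as a signal to eventually drop pages, so this should only kick in under genuine load.

```python
import time

# Illustrative values: tune against your own server metrics.
BOT_SIGNATURES = ("googlebot", "bingbot", "ahrefsbot")
LOAD_SOFT_LIMIT = 4.0   # above this, start throttling bot requests
LOAD_HARD_LIMIT = 8.0   # above this, reject all bot requests

def throttle_decision(user_agent, server_load):
    """Decide whether a request gets content (200) or a 503.

    User traffic is never throttled; bot traffic is throttled
    progressively as total server load rises, implementing the
    "users before bots" rule from the question.
    """
    ua = user_agent.lower()
    if not any(bot in ua for bot in BOT_SIGNATURES):
        return 200, {}                      # real users always pass
    if server_load >= LOAD_HARD_LIMIT:
        return 503, {"Retry-After": "120"}  # ask the bot to back off
    if server_load >= LOAD_SOFT_LIMIT:
        # Reject a fraction of bot requests that grows with load.
        reject_ratio = (server_load - LOAD_SOFT_LIMIT) / (
            LOAD_HARD_LIMIT - LOAD_SOFT_LIMIT)
        if hash((ua, int(time.time()))) % 100 < reject_ratio * 100:
            return 503, {"Retry-After": "60"}
    return 200, {}
```

The Retry-After header tells well-behaved crawlers how long to wait before retrying, which keeps the 503s looking like the temporary condition they are meant to signal.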
White Hat / Black Hat SEO | internetwerkNU
-
What is the difference between the two rewrite rules in htaccess?
Force www. prefix in URLs and redirect non-www to www:

RewriteCond %{HTTP_HOST} !^www.domain.com.ph
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]

Force www. prefix in URLs and redirect non-www to www - 2nd option:

RewriteCond %{HTTP_HOST} ^domain.com.ph [NC]
RewriteRule (.*) http://www.domain.com.ph/$1 [R=301,L]

White Hat / Black Hat SEO | esiow2013
-
Blog on 2 domains (.org/.com), Canonical to Solve?
I have a client that has moved a large majority of content, including the blog, to their .org domain. This is causing some issues for the .com domain. I want to retain the blog on the .org and have its content also show on the .com; I would place the canonical tag on the .com. Is this possible? Is this recommended?
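A cross-domain canonical for that setup would look like the snippet below, placed in the head of the .com copy of each post and pointing at the .org original (the domain and path here are placeholders). Note that Google treats rel=canonical, especially across domains, as a hint rather than a directive, so it is supported but not guaranteed to be honored:

```html
<!-- On the .com copy, e.g. https://www.example.com/blog/some-post -->
<head>
  <link rel="canonical" href="https://www.example.org/blog/some-post" />
</head>
```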
White Hat / Black Hat SEO | Ngst
-
Blogger Reviews w/ Links - Considered a Paid Link?
As part of my daily routine, I checked out inbound.org and stumbled upon an article about Grey Hat SEO techniques. One of the techniques mentioned was sending product to a blogger for review. My question is whether these types of links are really considered paid links. Why shouldn't an e-commerce company evangelize its product by sending to bloggers whose readership is the demographic the company is trying to target? In pre e-commerce marketing, it was very typical for a start-up company to send samples for review. Additionally, as far as flow of commerce is concerned, it makes sense for a product review to direct the reader to the company, whether by including a contact phone number, a mailing address, or in today's e-commerce world, a link to their website. I understand the gaming potential here (as with most SEO techniques, black-hat is usually an extreme implementation), but backlinks from honest product reviews shouldn't have a tinge of black, thus keeping it white-hat. Am I wrong here? Are these types of links really grey? Any help or insight is much appreciated!
White Hat / Black Hat SEO | b4004040
-
Powered by/Credit backlinks and nofollow
Pseudo question: I have a website that has 100K pages. On about 50K of those pages I have information that is fed to me via an outside 3rd-party website. Now, I like to give credit where credit is due, so I add a backlink to the website that is feeding me this content. A simple backlink like so: Information provided by: Company ABC Now, this 3rd-party website wants me to remove the nofollow tags from the backlink, but I am very, very skeptical because to me, sending ~50K dofollow backlinks to a single site might make the Google monster upset with me. This 3rd-party site is being very hard-headed about this, to the point where I am thinking of terminating the relationship all together. I digress. Scoured the net before writing this, but couldn't really find anything directly related to my issue. Thoughts? Is a nofollow required here? We're not talking 1 or 2 links here; we're talking tens of thousands (50K is low; it will probably be upwards of 100K when all is said and done as my site has many, many pages). Thanks in advance.
White Hat / Black Hat SEO | THB
-
Difference between Syndication, Autoblogging, and Article Marketing
Rand's slide deck titled "10 Steps to Effective SEO & Rankings" from InfusionCon 2011 recommends, on slide 82, content syndication as a method for building traffic and links. How is this any different from article marketing? He gave an example of this using a screenshot of this search result for "headsmacking tip discussion." All of those sites that have republished SEOmoz's content are essentially autoblogs that post ONLY content generated by other people, for the purpose of generating ad clicks from their organic traffic. We know that Google has clearly taken a position against these types of sites that offer no value. We hear Matt Cutts say to stay away from article marketing because you're just creating lots of duplicate content. It seems to me that "syndication" is just another form of article marketing that spreads duplicate content throughout the web. Can someone help me understand the difference? By the way, the most interesting one I saw in those results was the syndicated article on businessweek.com!
White Hat / Black Hat SEO | summitseo