Why is my server's IP address showing up in Webmaster Tools?
-
Under "Links to your site" in Google Webmaster Tools, I am seeing over 28,000 links from an IP address. The IP address is the one my server is hosted on. For example, it shows 200.100.100.100/help, almost as if there are two copies of my site: one under the domain name and one under the IP address. Is this bad? Or is it just showing up there, and Google knows it is the same site since the IP and the domain point to the same server?
-
Hmmm, this is a weird one. My guess is that Google originally found those links through the IP address (maybe before your site launched, while the pages were already live and linked to via the IP?), so it keeps returning to them and finding them. In that case, there's not much you can do except keep those canonicals in place.
Canonicals really can save you from duplicate content problems: I've had clients with multiple versions of every page depending on the path you take to reach it, and canonicals have allowed them to rank well and avoid penalties entirely. As long as you're doing everything else right, this hopefully shouldn't be too much of an issue.
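For reference, a self-referential canonical is just a single tag in each page's head. A minimal sketch, using the /help path from your example and www.example.com as a stand-in for your real domain:

```html
<!-- Served identically at https://www.example.com/help and http://200.100.100.100/help;
     the canonical tells Google to credit the domain version either way. -->
<head>
  <link rel="canonical" href="https://www.example.com/help" />
</head>
```

Because the same tag is rendered on the IP version of the page, it consolidates the two copies onto the domain.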
Sorry this ended up falling on you!
-
According to the latest links report in Webmaster Tools, the first time it happened was October 2012, which is before the site launched. It seems to have accelerated this year. It shows a total of 16,341 links, but under linked pages it only lists 27.
-
Hm, it could have, though. When did you first notice these backlinks from the IP address in GWT?
-
I am unsure, to be honest. We had an organic traffic drop in 2012 during the week of the Penguin release. We launched a new site last year, which killed our organic traffic, so I am trying to improve our rankings. I can say confidently that we have had no warnings or messages in Webmaster Tools, but maybe it has hurt traffic.
-
Well, from an SEO perspective, this hasn't led to any penalties or reduced rankings, right?
-
Recently we switched to HTTPS, so I started using a self-referential rel="canonical" on all my pages. I can't figure this out, and nobody else can either. I am on all sorts of boards, forums, and groups, and nobody has ever heard of this. I just don't get it.
-
Did you at least add canonicals to make sure that Google wouldn't see the IP version as duplicate content? That's what I'd be most worried about from an SEO perspective.
-
I never solved the problem. I made a new post to see if anything has changed. It seems strange that nobody else has ever had this problem; I searched all over Google and found nothing. I just ran Screaming Frog and nothing showed up.
-
How is this going? Did you solve the problem?
One quick note: if you can't find a link to the IP address on your site (or a link to a broken URL or an old domain), run a Screaming Frog or Xenu crawl and look at all external links. There's probably a surprise footer link or something like that causing the problem, and it would be easy to miss manually. But tools find all!
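If you'd rather script that check than run a full crawler, here is a rough sketch of the same idea: walk the site and flag any link whose host is a bare IP. It assumes the requests and beautifulsoup4 packages are installed and uses https://www.example.com/ as a placeholder for the real start URL.

```python
# Minimal crawl sketch: flag any href/src that points at a bare IP address
# (e.g. 200.100.100.100) instead of the canonical domain.
import re
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder; use your real domain
IP_HOST = re.compile(r"^\d{1,3}(\.\d{1,3}){3}$")

seen, queue = set(), [START_URL]
while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue  # skip pages that fail to load
    for tag in BeautifulSoup(html, "html.parser").find_all(["a", "link", "img", "script"]):
        link = tag.get("href") or tag.get("src")
        if not link:
            continue
        absolute = urljoin(url, link)
        host = urlparse(absolute).hostname or ""
        if IP_HOST.match(host):
            print(f"IP link found on {url}: {absolute}")  # the culprit to fix
        elif host == urlparse(START_URL).hostname:
            queue.append(absolute.split("#")[0])  # keep crawling the same domain
```

It's only a sketch (no crawl delays, no robots.txt handling), but it will surface a stray footer or template link quickly.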
Good luck.
-
Yeah, it's generally a DNS setup issue. If you're hosting with a company, the best thing to do is open a ticket and get them to walk through it with you. Most providers have their own admin panels.
-
I have looked and can't find anything on the site that links to the IP. I have also looked in Webmaster Tools and it doesn't show any duplicate content. We are on a Windows server; do you think it would be fairly easy to redirect the IP to the domain?
-
There might be a link or something directing the crawlers to your site's IP address instead of the domain. There is some potential for getting flagged for duplicate content, but I feel it's fairly unlikely. You do want to fix this, though, because it would hamper your backlink efforts. These steps should correct the issue:
1. Set up canonical tags on all your pages. This tells Google which single URL should get credit for each page, whether it is reached via the IP or the domain.
2. Set up your host so that anything requested via the IP is automatically redirected to the domain. This can be done with your hosting company, through .htaccess, or through PHP; I suggest you do it with the hosting company (two example redirect rules are sketched after this list).
3. Check through your site and make sure no links point to the IP address. If no links point to the IP, the crawler shouldn't follow it.
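To make step 2 concrete, here are two hedged sketches of that redirect: one for Apache via .htaccess, and one for IIS via web.config, since you mentioned you're on a Windows server. Both assume the relevant rewrite module is enabled (mod_rewrite on Apache, the URL Rewrite module on IIS), and both use www.example.com as a placeholder for your real domain with the 200.100.100.100 IP from your example.

```apache
# .htaccess sketch (Apache with mod_rewrite assumed).
RewriteEngine On
# If the request came in on the bare IP rather than the hostname...
RewriteCond %{HTTP_HOST} ^200\.100\.100\.100$
# ...301-redirect it to the same path on the canonical domain.
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

```xml
<!-- web.config sketch (IIS with the URL Rewrite module assumed). -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Redirect IP to domain" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <!-- Only fire when the request's Host header is the bare IP. -->
            <add input="{HTTP_HOST}" pattern="^200\.100\.100\.100$" />
          </conditions>
          <action type="Redirect" url="https://www.example.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

The 301 plus the canonicals from step 1 should consolidate the IP-based links onto the domain over time.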