Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Why is my server's IP address showing up in Webmaster Tools?
-
In the "links to my site" report in Google Webmaster Tools, I am seeing over 28,000 links from an IP address. The IP address is the one my site is hosted on. For example, it shows 200.100.100.100/help, almost as if there are two copies of my site: one under the domain name and one under the IP address. Is this bad? Or is it just showing up there, and Google knows they are the same since the IP and the domain point to the same server?
-
Hmmm, this is a weird one. My guess is that since Google originally found those links (maybe before your site launched, while the pages were linked to and live via the IP address?), it keeps returning to them and finding them. In that case, there's not much you can do except keep those canonicals on.
Canonicals really can save you from duplicate content problems: I've had clients with multiple versions of every page based on the path you take to a page, and canonicals have allowed them to rank well and avoid penalties entirely. As long as you're doing everything else right, hopefully this shouldn't be too much of an issue.
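For reference, a canonical tag is a single `<link>` element in each page's `<head>` that names the preferred URL. A minimal sketch, using a placeholder domain rather than the poster's actual site:

```html
<!-- Served on both http://200.100.100.100/help and the domain-based URL; -->
<!-- either way it tells Google the domain-based address is the one to index. -->
<link rel="canonical" href="https://www.example.com/help" />
```

Because the tag travels with the page, it covers the IP-based duplicate automatically: when Google fetches the page via the IP, the canonical still points back at the domain.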
Sorry this ended up falling on you!
-
According to the "latest links" report in Webmaster Tools, the first time it happened was October 2012, which is before the site launch. It seems to have accelerated this year. It shows a total of 16,341 links, but under linked pages it only lists 27.
-
Hm, it could have, though. When did you first notice these backlinks from the IP address in GWT?
-
I am unsure, to be honest. We had an organic traffic drop in 2012 the week of the Penguin release. We launched a new site last year, which killed organic traffic, so I am trying to improve our rankings. I can say confidently we have had no warnings in Webmaster Tools, but maybe it has hurt traffic.
-
Well, from an SEO perspective, this hasn't led to any penalties or reduced rankings, right?
-
We recently switched to HTTPS, so I started using a self-referential rel="canonical" on all my pages. I can't figure this out, and nobody else can either. I am on all sorts of boards, forums, and groups, and nobody has ever heard of this. I just don't get it.
-
Did you add canonicals, at least, to make sure that Google wouldn't find duplicate content? That's what I'd be most worried about, from an SEO perspective.
-
I never solved the problem. I made a new post to see if anything has changed. It seems strange that nobody else has ever had this problem. I searched all over Google and found nothing. I just ran Screaming Frog and nothing showed up.
-
How is this going? Did you solve the problem?
One quick note: if you can't find a link to the IP address on your site (or a broken link, or a link to an old domain), run a Screaming Frog or Xenu crawl and look at all external links. There's probably a surprise footer link or something similar causing the problem, and it would be easy to miss manually. But the tools find everything.
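If a full crawl isn't handy, the same check can be scripted. Here's a rough sketch using Python's standard-library HTML parser to flag any href or src attribute that points at a bare IPv4 address; the sample markup and URLs are made up for illustration:

```python
import re
from html.parser import HTMLParser

# Matches URLs whose host is a bare IPv4 address, e.g. http://200.100.100.100/help
IP_URL = re.compile(r"^https?://\d{1,3}(?:\.\d{1,3}){3}(?:[/:]|$)")


class IPLinkFinder(HTMLParser):
    """Collects href/src attribute values that point at a raw IP address."""

    def __init__(self):
        super().__init__()
        self.ip_links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value and IP_URL.match(value):
                self.ip_links.append(value)


def find_ip_links(html):
    """Return every IP-based link found in the given HTML string."""
    finder = IPLinkFinder()
    finder.feed(html)
    return finder.ip_links


if __name__ == "__main__":
    sample = '<a href="http://200.100.100.100/help">help</a><a href="https://example.com/">home</a>'
    print(find_ip_links(sample))  # ['http://200.100.100.100/help']
```

Running it against the saved source of a few key templates (header, footer, sitemap page) is usually enough, since shared includes are where stray IP links tend to hide.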
Good luck.
-
Yeah, it's generally a DNS setup issue. If you're hosting with a company, the best thing to do is open a ticket and have them walk through it with you. Most providers have their own admin panels.
-
I have looked and can't find anything on the site that links to the IP. I have looked in Webmaster Tools and it doesn't show any duplicate content. We are on a Windows server; do you think it would be pretty easy to redirect the IP to the domain?
-
There might be a link or something directing the crawlers to your site's IP address instead of the actual domain. There is potential for getting flagged for duplicate content, but I feel it's fairly unlikely. You do want to fix this, though, as it could hamper your backlink efforts. These steps will correct the issue:
1. Set up canonical tags on all your pages. This tells Google which single URL should be credited for each page, whether the crawler reaches it via the IP or the domain.
2. Set your host up so that any request to the IP is automatically 301-redirected to the domain. This can be done with your hosting company, through .htaccess, or through PHP. I suggest you do it with the hosting company.
3. Check through your site and make sure no links point to the IP address. If there are no links pointing to the IP, the crawler shouldn't follow them.
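Since the poster mentioned a Windows server, the redirect in step 2 would typically live in web.config rather than .htaccess. A rough sketch, assuming the IIS URL Rewrite module is installed and using the IP from this thread's example with a placeholder domain:

```xml
<!-- web.config: 301-redirect any request whose Host header is the bare IP
     to the domain name, preserving the path. Assumes IIS URL Rewrite. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="IP to domain" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^200\.100\.100\.100$" />
          </conditions>
          <action type="Redirect" url="https://www.example.com/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

The Apache equivalent in .htaccess would be a `RewriteCond %{HTTP_HOST} ^200\.100\.100\.100$` followed by `RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]`.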