What is the best practice to eliminate my IP address content from showing in SERPs?
-
Our eCommerce platform provider has our site load-balanced across a few data centers. Our site has two of our own exclusive IP addresses associated with it (one in each data center).
The problem is that Google is showing our IP addresses in the SERPs with what I assume is bad duplicate content (our own, at that).
I brought this to the attention of our provider, and they say they must keep the IP addresses open to allow their site monitoring software to work. Their solution was to add robots.txt files for both IP addresses with site-wide/root disallows.
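For reference, the site-wide/root disallow they added is essentially a two-line robots.txt served from the root of each IP address (shown here as a generic sketch that applies to all crawlers):

    User-agent: *
    Disallow: /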
As a side note, we just added canonical tags so the pages indexed under the IP addresses ultimately show the correct (non-IP-address) URL via the canonical.
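In case it helps, each IP-served page now carries a canonical pointing back to the equivalent page on our real domain - a sketch with a placeholder domain and path:

    <link rel="canonical" href="http://www.example.com/category/product-page" />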
So here are my questions.
-
Is there a better way?
-
If not, is there anything else we need to do to get Google to drop the several hundred thousand indexed pages at the IP address level? Or do we sit back and wait now?
-
-
I would allow Google to crawl those pages for a little while longer just to ensure that they see the rel=canonical tags. Then, once you feel that they have recrawled the IP address pages, you can disallow them again if you want, though that isn't entirely necessary if you have the rel=canonical tag set up properly.
Another option would be to 301 redirect the IP version of each page to the corresponding www version.
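If the host allows it, that can be done with a host-based rewrite - a hypothetical Apache/.htaccess sketch, assuming mod_rewrite and using a placeholder IP and domain (nginx has an equivalent approach):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^203\.0\.113\.10$
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

One caveat here: your provider said they need the IPs reachable for their monitoring software, so they may need to exclude the monitoring tool's requests from the redirect.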
If they still don't drop from the index, you can use the URL Removal Tool in GWT, but you will have to set up a GWT account for each of the IP addresses.
-
Thanks. Any suggestions on how to get Google to drop these pages (make them inactive)?
-
Hi,
Since doing the disallow on the IP address sites, they are no longer getting crawled.
** Note the distinction: what you have is a robots.txt disallow. A disavow list would not stop Google from crawling those domains/pages anyway - Google just treats disavowed links as nofollow, so they won't pass PageRank.
You will still see those links in Webmaster Tools, and they will still be active.
-
Sorry - I just thought of something that could pose a problem and was hoping to get your advice.
Since doing the disallow on the IP address sites, they are no longer getting crawled. Does that mean that the canonical tags within those IP address sites won't be able to do their work?
Or
Will the canonicals picked up from the proper domain help the search engines know they should consolidate the indexed pages from the now disallowed IP addresses?
I am seeing that the IP addresses are no longer being crawled, and the number of indexed pages is staying about the same (not going down).
Thoughts?
-
Thanks!
-
Thanks. We are getting large daily crawls (nearly 100k pages a day), so fingers crossed this will sort it out soon.
-
Hi,
The canonical solution should be enough; however, I would still build some XML sitemaps and submit those via Webmaster Tools to speed up the process. You can also build some HTML sitemaps with a clear structure and add those in the footer - again, to speed up the process a little bit.
If you split the content into multiple XML sitemaps, you can also track the crawling progress for each one.
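For example, a sitemap index that splits the URLs into chunks could look like the sketch below (placeholder domain and file names); each child sitemap then reports its own submitted vs. indexed counts in Webmaster Tools:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>http://www.example.com/sitemap-products-1.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-products-2.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-categories.xml</loc></sitemap>
    </sitemapindex>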
You should also check your crawl stats in Webmaster Tools to see how many pages on average Googlebot is hitting each day - based on those numbers you can make a rough prediction of how long it will take for Google to recrawl your pages.
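As a rough worked example: with the "several hundred thousand" indexed IP-address pages mentioned above (say 300,000) and a crawl rate of roughly 100,000 pages per day, it would take about 300,000 / 100,000 = 3 days if every crawl hit landed on an IP-address URL; since only a fraction of the daily crawl will actually go to those pages, at 10% it would be closer to 30 days.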
If your numbers are "bad", you will need to improve them somehow to help the process along - it can do wonders...
Hope it helps.
-
The canonical solution you have implemented is perfect. If you have decent authority and get deep crawls every couple of days, you should be fine, and pages from your IP should start to disappear shortly.
I would not worry about it anymore. You are on the right track. Sit back, relax, and enjoy your flight.