What is the best practice to eliminate my IP address content from showing in SERPs?
-
Our eCommerce platform provider has our site load balanced in a few data centers. Our site has two of our own exclusive IP addresses associated with it (one in each data center).
The problem is that Google is showing our IP addresses in the SERPs, with what I assume counts as bad duplicate content (our own, at that).
I brought this to the attention of our provider and they say they must keep the IP addresses open to allow their site monitoring software to work. Their solution was to add robots.txt files for both IP addresses with site-wide/root disallows.
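For reference, the site-wide disallow the provider proposed is only two lines. Served only from the IP hosts (never from the real hostname), it might look like this (203.0.113.10 is a documentation placeholder, not our real address):

```text
# robots.txt served from the IP hosts only, e.g. http://203.0.113.10/robots.txt
User-agent: *
Disallow: /
```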
As a side note, we just added canonical tags so the pages indexed under the IP addresses ultimately show the correct (non-IP-address) URL via the canonical.
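For anyone following along, the canonical tag on a page served from the IP host simply points back at the hostname URL (the domain and path below are placeholders):

```html
<!-- In the <head> of every page, whichever host it happens to be served from -->
<link rel="canonical" href="http://www.example.com/products/widget-123" />
```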
So here are my questions.
-
Is there a better way?
-
If not, is there anything else we need to do to get Google to drop the several hundred thousand indexed pages at the IP address level? Or do we sit back and wait now?
-
I would allow Google to crawl those pages for a little while longer, just to ensure that it sees the rel=canonical tags. Then, once you feel they have re-crawled the IP address pages, you can disallow them again if you want, though that isn't strictly necessary if you have the rel=canonical tags set up properly.
Another option would be to 301 redirect the IP version of each page to the corresponding www version.
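As a sketch of that second option, assuming the provider runs Apache (the IP, hostname, and monitoring path below are all placeholders; the carve-out would need whatever URL their monitoring software actually polls):

```apache
# Catch requests whose Host header is the bare IP and 301 them
# to the same path on the canonical hostname.
<VirtualHost *:80>
    ServerName 203.0.113.10
    RewriteEngine On
    # Hypothetical carve-out so the provider's monitoring endpoint keeps working
    RewriteCond %{REQUEST_URI} !^/monitoring-check$
    RewriteRule ^(.*)$ http://www.example.com$1 [R=301,L]
</VirtualHost>
```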
If they still don't drop from the index you can use the URL Removal Tool in GWT (Google Webmaster Tools), but you will have to set up a GWT account for each of the IP domains.
-
Thanks. Any suggestions on how to get Google to drop these pages (make them inactive)?
-
Hi,
Since doing the disallow on the IP address sites, they are no longer getting crawled.
** The disavow list won't stop Google from crawling those domains/pages. Google will just treat those links as nofollow, so they won't pass PageRank.
You will still see those in Webmaster Tools; the links will still be active.
-
Sorry - I just thought of something that could pose a problem and was hoping to get your advice.
Since doing the disallow on the IP address sites, they are no longer getting crawled. Does that mean the canonical tags within those IP address sites won't be able to do their work?
Or
Will the canonicals picked up from the proper domain help the search engines know they should consolidate the indexed pages from the now-disallowed IP addresses?
I am seeing that the IP addresses are no longer being crawled, and the number of pages in their indexes is staying about the same (not going down).
Thoughts?
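The worry is well founded, and easy to demonstrate: a compliant crawler consults robots.txt before fetching a page, so a site-wide disallow means the page body, canonical tag included, is never downloaded at all. A minimal sketch with Python's standard library (the IP and path are placeholders):

```python
from urllib import robotparser

# The site-wide disallow the provider put on the IP hosts
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A compliant bot may not fetch any page on the IP host, so it never
# sees the <link rel="canonical"> inside those pages.
blocked = not rp.can_fetch("Googlebot", "http://203.0.113.10/products/widget-123")
print(blocked)  # True
```

In other words, robots.txt controls crawling, not indexing: blocked URLs can stay in the index, which matches the "pages not going down" observation above.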
-
Thanks!
-
Thanks. We are getting large daily crawls (nearly 100k a day) so fingers crossed this will sort it out soon.
-
Hi,
The canonical solution should be enough; however, I would still build some XML sitemaps and submit those via Webmaster Tools to speed up the process. You can also build some HTML sitemaps with a clear structure and add those to the footer, again to speed things up a little.
If you split the content into multiple XML sitemaps you can also track the crawling progress per file.
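As an illustration of that split, a sitemap index pointing at a few smaller files (all URLs hypothetical) lets Webmaster Tools report submitted vs. indexed counts for each file separately:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index; each child file can be tracked on its own -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap-products-1.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-products-2.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-categories.xml</loc></sitemap>
</sitemapindex>
```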
You should also check your crawl rate in Webmaster Tools to see how many pages, on average, Googlebot is hitting each day; based on those numbers you can make a rough prediction of how long it will take Google to re-crawl your pages.
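That prediction is just a ceiling division. Using the thread's own figures as rough assumptions (several hundred thousand IP-address pages, nearly 100k pages crawled per day):

```python
indexed_ip_pages = 300_000   # assumption: "several hundred thousand" from the question
crawled_per_day = 100_000    # the poster reports "nearly 100k a day"

# Ceiling division: days needed if every crawled page were an IP page.
# Optimistic, since the crawl budget is shared with the real domain.
days_to_recrawl = -(-indexed_ip_pages // crawled_per_day)
print(days_to_recrawl)  # 3
```

Treat the result as a lower bound; in practice only a fraction of the daily crawl lands on the IP hosts.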
If your numbers are "bad" you will need to improve them somehow to help the process along - it can do wonders...
Hope it helps.
-
The canonical solution you have implemented is perfect. If you have decent authority and get deep crawls every couple days, you should be fine and pages from your IP should start to disappear shortly.
I would not worry about it anymore. You are on the right track. Sit back, relax, and enjoy your flight.