What is the best practice to eliminate my IP address content from showing in SERPs?
-
Our eCommerce platform provider has our site load balanced across a few data centers. Our site has two of our own exclusive IP addresses associated with it (one in each data center).
The problem is that Google is showing our IP addresses in the SERPs with what I assume is harmful duplicate content (our own, at that).
I brought this to the attention of our provider, and they say they must keep the IP addresses open so their site-monitoring software will work. Their solution was to add a robots.txt file on each IP address with a site-wide/root disallow.
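For reference, the site-wide disallow they proposed would be a robots.txt served at the root of each IP address, along these lines (a minimal sketch):

    User-agent: *
    Disallow: /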
As a side note, we just added canonical tags, so the pages indexed under the IP addresses ultimately point to the correct (non-IP) URL via the canonical.
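For context, the canonical tag on each IP-served page points back to the equivalent page on the real hostname, something like this in the <head> (example.com is a placeholder for our actual domain):

    <link rel="canonical" href="http://www.example.com/some-product-page" />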
So here are my questions.
-
Is there a better way?
-
If not, is there anything else we need to do to get Google to drop the several hundred thousand pages indexed at the IP address level? Or do we just sit back and wait now?
-
I would allow Google to crawl those pages for a little while longer, just to ensure that it sees the rel=canonical tags. Once you feel Google has recrawled the IP address pages, you can disallow them again if you want, though that isn't strictly necessary if the rel=canonical tags are set up properly.
Another option would be to 301 redirect the IP version of each page to the corresponding www version.
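For example, on an Apache server that could be handled in .htaccess with mod_rewrite - a sketch assuming Apache, with documentation-range placeholder IPs and example.com standing in for the real hostname:

    RewriteEngine On
    # Match requests whose Host header is one of the raw IP addresses
    RewriteCond %{HTTP_HOST} ^203\.0\.113\.10$ [OR]
    RewriteCond %{HTTP_HOST} ^198\.51\.100\.20$
    # Permanently redirect to the same path on the canonical www hostname
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Note that a blanket redirect like this could conflict with the provider's monitoring requirement unless the monitoring requests are exempted.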
If they still don't drop from the index, you can use the URL Removal Tool in GWT, but you will have to set up a separate GWT account for each of the IP address "domains."
-
Thanks. Any suggestions on how to get Google to drop these pages (make them inactive)?
-
Hi,
Regarding "Since doing the disallow on the IP address sites, they are no longer getting crawled" - note that a disavow list, by contrast, won't stop Google from crawling those domains/pages. Google will just treat those links as nofollow, so they won't pass PageRank.
You will still see those links in Webmaster Tools, and they will still be active.
-
Sorry - I just thought of something that could pose a problem and was hoping to get your advice.
Since adding the disallow on the IP address sites, they are no longer getting crawled. Does that mean the canonical tags within those IP address sites won't be able to do their work?
Or will the canonicals picked up from the proper domain help the search engines understand that they should consolidate the indexed pages from the now-disallowed IP addresses?
I am seeing that the IP addresses are no longer being crawled, yet the number of pages in the index is staying about the same (not going down).
Thoughts?
-
Thanks!
-
Thanks. We are getting large daily crawls (nearly 100k a day), so fingers crossed this will sort itself out soon.
-
Hi,
The canonical solution should be enough; however, I would still build some XML sitemaps and submit them via Webmaster Tools to speed up the process. You can also build some HTML sitemaps with a clear structure and add them to the footer - again, to speed things up a little.
If you split the content into multiple XML sitemaps, you can also track the crawling progress for each section.
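For illustration, a sitemap index pointing at the split sitemaps might look something like this (a minimal sketch; the URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one child sitemap per section, so crawl/index counts can be tracked per section -->
      <sitemap>
        <loc>http://www.example.com/sitemap-products.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-categories.xml</loc>
      </sitemap>
    </sitemapindex>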
You should also check your crawl stats in Webmaster Tools to see how many pages, on average, Googlebot is hitting each day - based on those numbers you can roughly predict how long it will take Google to recrawl your pages. As a rough example, at ~100k pages crawled per day, several hundred thousand IP address URLs could in theory be recrawled within a week.
If your numbers are "bad," you will need to improve them somehow to help the process - it can do wonders...
Hope it helps.
-
The canonical solution you have implemented is perfect. If you have decent authority and get deep crawls every couple of days, you should be fine, and pages from your IP addresses should start to disappear shortly.
I would not worry about it anymore. You are on the right track. Sit back, relax, and enjoy your flight.