Best way to de-index content from Google and not Bing?
-
We have a large quantity of URLs that we would like to de-index from Google (we are affected by Panda), but not Bing. What is the best way to go about doing this?
-
Hi michelleh
The solution given by Dan above is the most reliable method, as robots.txt only blocks crawling and will not keep a page out of the index if Googlebot finds it via an external link to the page. Given the reasoning behind your desire to noindex, reliability is extremely important.
Also, you want "noindex, follow" rather than "noindex, nofollow" as the nofollow will trap any link value coming into the pages (from both internal and external links) and stop it from flowing through the site.
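For illustration, a minimal sketch of a Google-only directive (assuming it is added to the <head> of every page you want out of Google's index) looks like this:

<!-- Sketch: only Googlebot reads this crawler-specific tag, so Bing can keep the page indexed -->
<meta name="googlebot" content="noindex, follow">

Bing reads the generic name="robots" tag (and name="bingbot"), so as long as the tag stays crawler-specific, only Google is affected.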
Hope that helps,
Sha
-
Is there any advantage to using "noindex, nofollow" over robots.txt? I've read that noindex, nofollow still accumulates pagerank for the page with the tag, but if we don't care about accumulating pagerank, is there any other advantage to using noindex, nofollow over robots.txt?
-
With robots.txt you can also use the URL removal tool in WMT to hurry things up, but first you need to block the URLs in robots.txt:
User-agent: *
Disallow:

# "Googlebot" is Google's user-agent token; Bing and other crawlers follow the group above
User-agent: Googlebot
Disallow: /bad
Related Questions
-
Same content, different URL - Google Analytics or other options
One of my clients came to me today asking to mirror their current website and host it on a different domain (i.e. domain.com and domain.ca). Their reasoning is that there is no actual way to confirm, from the form entries on their website, which visitors are submitting the form organically and which ones are submitting it through the AdWords campaign they are running (which I have nothing to do with). I know Google Analytics will show you visits to pages on a site and then you can find out which source (organic vs. cpc) they came from, but it won't confirm the source on the actual form entry. So my client feels the only way to get this information would be to mirror the website so that the separate analytics would validate their ad spend. Does anyone know of any tools that I could use for something like this? I do NOT want to mirror the website, as I am fearful of duplicate content and the YEARS of organic SEO work I have put into this website going to waste. The other element I should mention is that the client only wants to have this "mirrored" site up for 2 months. Any thoughts, suggestions and arguments for a mirrored website are welcomed! Thanks!
-
Same language, different countries. What would be the best way to introduce it?
Hello, We have a .com Magento store with US geo-targeting. We're going to launch different versions soon, one for the US and another one for Canada (we're going to add Spanish and French versions later as well). The stores' content will be the same, except for currency and the contact us page. What would be the better strategy to introduce it to Google? What is the better URL structure: example.com/ca/, example.com/en-ca/, or ca.example.com/? Should we stay with the original www.example.com/ (example.com) and just close access to /ca/ and /us/, or use rel=canonical, or use "alternate" hreflang to avoid duplicate content issues? Thanks in advance
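For illustration, hreflang annotations for the country split described above might look something like this in each page's <head> (a sketch only; URLs are hypothetical examples based on the example.com/ca/ structure):

<!-- Sketch: the same English content annotated for the US and Canada; URLs are hypothetical -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/" />

With hreflang in place the two versions reference each other rather than competing, which is the usual alternative to blocking one of them or canonicalising one away.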
-
Best way for Google and Bing not to crawl my /en default English pages
Hi Guys, I just transferred my old site to a new one and now have subfolder TLDs. My default pages from the front end and sitemap don't show /en after www.mysite.com. The only translation I have is Spanish, where Google will crawl www.mysite.com/es (Spanish).
1. In the SERPs of Google and Bing, every URL that is crawled shows the extra "/en" in my TLD. I find that very weird considering there is no physical /en in my URLs. When I select the link it automatically redirects to its default and natural page (no /en). None of the canonical tags show /en either, ONLY the SERPs. Should robots.txt be updated to "disallow /en"?
2. While I did the site transfer, we also altered some of the category URLs in our domain. So we've had a lot of 301 redirects, but while searching specific keywords in the SERPs, the #1 ranked URL shows up as our old URL that redirects to a 404 page, and our newly created URL shows up as #2 and goes to the correct page. Is there any way to tell Google to stop showing our old URLs in the SERPs? And would the "Fetch as Google" option in GWT be a good option to submit all of my URLs so Google bots can crawl the right pages only? Direct message me if you want real examples. Thank you so much!
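On the first question, a common pattern (a sketch only, with a hypothetical URL) is to rely on the canonical tag rather than a robots.txt disallow, since a URL blocked in robots.txt can no longer pass its redirect or canonical signal along:

<!-- Sketch: self-referencing canonical on the default (non-/en) page; any /en duplicate that still resolves should carry the same tag -->
<link rel="canonical" href="https://www.mysite.com/category/page/" />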
-
Huge Google index on E-commerce site
Hi Guys, I got a question which I can't understand. I'm working on an e-commerce site which recently got a CMS update, including URL updates. We did a lot of 301s on the old URLs (around 3,000-4,000 I guess) and submitted a new sitemap (around 12,000 URLs, of which 10,500 are indexed). The strange thing is: when I check the indexing status in Webmaster Tools, Google tells me there are over 98,000 URLs indexed. Doing a site:domainx.com search, Google tells me there are 111,000 URLs indexed. Another strange thing, which another forum member describes here: the cache date has been reverted. And on top of that, old URLs (which have had a 301 for about a month now) keep showing up in the index. Does anyone know what I could do to solve the problem?
-
Same content pages in different versions of Google - is it duplicate?
Here's my issue: I have the same page twice, content-wise, but on different URLs for each country, for example: www.example.com/gb/page/ and www.example.com/us/page. So one for the USA and one for Great Britain. Or it could be a subdomain, gb. or us., etc. Now, is it duplicate content if the US version of Google indexes one page and the UK version indexes the other page (same content, different URL)? The UK search engine will only see the UK page and the US the US page; different URLs but the same content. Is this bad for the Panda update, or does it get away with it? People suggest it is OK and good for localised search for an international website - I'm not so sure. Really appreciate advice.
-
Best way to duplicate a WordPress site for staging purposes?
I want to make some changes to my WordPress site and want to somehow set up a live staging area. Does anyone know of a good way to do this? I want all of the same content there; I just want to be able to make changes to it and try it all out before going live. Any thoughts on this? Also, I want to be sure the staging site doesn't get indexed, since it will be a complete duplicate of my existing site. Thanks!
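On the indexing concern, a minimal sketch (assuming you can edit the staging theme's header template, and ideally combined with password protection) is a site-wide robots meta tag on the staging copy only:

<!-- Sketch: added to the <head> of every page on the staging copy, never on the live site -->
<meta name="robots" content="noindex, nofollow">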
-
What is better for Google: keeping old, no-longer-visited content deep in the website, or removing it?
We have quite a lot of old content which is not visited anymore. Should we remove it and have a lot of 410 errors which will be reported in GWT? Or should we keep it and forget about it?
-
Is there a way to contact Google besides the Google product forum?
Our traffic from Google has dropped more than 35% and continues to fall. We have been on this forum and Google's webmaster forum trying to get help. We received great advice and have waited months, but instead of our traffic improving, it has worsened. We are being penalized by Google for many keywords such as trophies, trophies and awards, and countless others - we were on page one previously. We filed two reconsideration requests and were told both times that there were no manual penalties. Some of our pages continue to rank well, so it is not across the board (but all of our listings went down a bit). We have made countless changes (please see below). Our busy season was from March to May and we got clobbered. Google, as most people know, is a monopoly when it comes to traffic, so we are getting killed. At first we thought it was Penguin, but it looks like we started getting hit late last year. Lots of unusual things happened - we had a large spike in traffic for two days, then lost our branded keywords, then our main keywords. Our branded keywords came back pretty quickly, but nothing else did. We have received wonderful advice and made most of the changes. We are a very reputable company and have a feeling we are being penalized for something other than spamming. For example, we have a mobile site we added late last year, and a wholesale system was added around the same time. Since the date does not coincide with Penguin, we think there is some major technical driver, but have no idea what to do at this point. The webmasters have all been helpful, but nothing is working. We are trying to find out what one does in a situation like this, as we are trying to avoid closing our business. Thank you! Changes made:
1. We had many crawl errors, so we reduced them significantly.
2. We had introduced a mobile website in January which we thought may have been the cause (splitting traffic, duplicate content, etc.), so we had our mobile provider add the site to their robots.txt file.
3. We were told by a webmaster that there were too many links from our search provider, so we had them put the search pages in a robots.txt file.
4. We were told that we had too much duplicate content. This was / is true, as we have hundreds of legitimate products that are similar: for example, trophies and certificates that are virtually the same but are for different sports or have different colors and sizes. Still, we added more content and added noindex tags to many products. We compared our % of duplicates to competitors and it is far less.
5. At the recommendation of another webmaster, we changed many pages that might have been splitting traffic.
6. Another webmaster told us that too many people were linking into our site with the same text, namely Trophy Central, and that it might have appeared we were trying to game the system somehow. We have never bought links and don't even have a webmaster, although over the last 10 years we have worked with programmers and SEO companies (but we don't think any have done anything unusual).
7. At the suggestion of another webmaster, we have tried to improve our link profile. For example, we found Yahoo was not linking to our URL.
8. We were told to set up a 404 page, so we did.
9. We were told to ensure that all of the similar domains were pointing to www.trophycentral.com/, so we set up redirects.
10. We were told that a site we have was linking to us from too many places, so we reduced it to 1.
Our key pages have A rankings from SEOmoz for the selected keywords. We have made countless other changes recommended by experts but have seen no improvements (actually got worse). I am the president of the company and have made most of the above recent changes myself. Our website is trophycentral.com.