How to remove wrong crawled domain from Google index
-
Hello,
I'm running a WordPress multisite. When I create a new site for a client, we do the preparation using the multisite domain address (e.g. cameleor.cobea.be). To keep the site protected we use the "Multisite Privacy" plugin, which allows us to restrict the site to admins only. When the site is ready, we use a domain mapping plugin to redirect the client's domain to the multisite (e.g. cameleor.com).
Unfortunately, we recently switched our domain mapping plugin for another one, and two sites got crawled by Google on their multisite addresses as well. So now when you type "cameleor" into Google you get both domains in the SERPs (see here: http://screencast.com/t/0wzdrYSR).
It's been two weeks or so since we fixed the plugin issue, and cameleor.cobea.be now redirects to the correct address, cameleor.com.
My question: how can I get rid of those wrong URLs? I can't remove them in Google Webmaster Tools as they belong to another domain (cameleor.cobea.be, for which I can't get authenticated), and I wonder if they will ever be removed from the index, since they still redirect to something (no error in the eyes of Google)?
Does anybody have an idea or a solution for me, please?
Thank you very much for your help
Regards
Jean-Louis
-
OK. Thank you very much, Martijn and Dan, for your support.
I'll wait a bit more then and see if it goes away.
Best regards
Jean-Louis
-
Hi Jean-Louis
In theory, 301 redirects should remove the original site/pages from the index; it can just take some time. I see that it is in fact properly 301 redirecting now. I would actually not block the original (wrong) domain with robots.txt - allow Google to keep crawling it so it sees the 301 redirects on a regular basis, and it should remove them.
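For reference, a whole-subdomain 301 can be done in a few lines of Apache config. This is only a sketch, assuming the staging subdomain runs on Apache with mod_rewrite enabled; the host names are taken from your example and the exact rules would depend on your setup:

```apache
# Hypothetical .htaccess served on the staging subdomain (cameleor.cobea.be)
# Permanently redirects every request to the same path on the mapped domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^cameleor\.cobea\.be$ [NC]
RewriteRule ^(.*)$ http://cameleor.com/$1 [R=301,L]
```

You can confirm the redirect is a true 301 (not a 302) by checking the response headers in your browser's network tab or a header-checker tool.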
In cases where I've seen Google not de-index 301-redirected pages, a canonical tag pointing to the new URL has worked - but I don't think you need to go that far yet. I'd give it a little more time.
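If it does come to that, the canonical tag is just one line in the head of each page on the wrong domain, pointing at its counterpart on the correct one. The URLs below are illustrative, based on your example domains:

```html
<!-- In the <head> of a page served at cameleor.cobea.be -->
<link rel="canonical" href="http://cameleor.com/some-page/" />
```

Note that for Google to see the tag, the page has to actually render at the old URL, so a canonical is normally an alternative to a redirect rather than something you combine with one.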
-Dan
-
Hi Jean-Louis,
Why don't you just create a robots.txt file for the multisite domain addresses you use to prepare your client sites? By doing this you will exclude Google from crawling your subdomain and make sure it ignores the rest of the URLs. This should definitely increase your chances of getting the duplicate URLs removed from their index.
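For what it's worth, a minimal robots.txt that blocks all crawlers from a staging subdomain is just two lines, served at the root of that subdomain (e.g. cameleor.cobea.be/robots.txt):

```
User-agent: *
Disallow: /
```

One caveat: blocking crawling does not by itself remove already-indexed URLs, and it can prevent Google from re-crawling them to see a redirect, so weigh this against the redirect-based approach discussed above.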
Hope this helps!