Getting rid of a site in Google
-
Hi,
I have two sites, let's call them Site A and Site B; both are subdomains of the same root domain. Because of a server configuration error, both got indexed by Google.
Google reports millions of inbound links from Site B to Site A.
I want to get rid of Site B, because it's duplicate content.
First I tried removing the site in Webmaster Tools and blocking all content in Site B's robots.txt. That removed all of Site B's content from the search results, but the links from Site B to Site A stayed in place and kept increasing (even after two months).
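For reference, blocking an entire subdomain is usually done with a catch-all disallow in that subdomain's robots.txt. A minimal sketch (the hostname is hypothetical):

```text
# robots.txt served at http://siteb.example.com/robots.txt
# Blocks all crawling of Site B for every crawler
User-agent: *
Disallow: /
```

Note that Disallow only stops crawling; it does not remove already-known URLs or the links Google has recorded from them, which is consistent with the behaviour described above.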
I also tried changing all the pages on Site B to return 404s, but that did not work either.
I then removed the blocks, cleaned up the robots.txt, and changed the server config on Site B so that everything 301-redirects to a landing page for Site B. But the links from Site B to Site A in Webmaster Tools are still increasing.
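A catch-all 301 of that kind might look like the following nginx sketch (hostnames and paths are hypothetical; an Apache RewriteRule setup would be equivalent):

```nginx
server {
    listen 80;
    server_name siteb.example.com;  # hypothetical Site B subdomain

    # Serve the landing page itself normally, to avoid a redirect loop
    location = /landing.html {
        root /var/www/siteb;
    }

    # 301 every other URL on Site B to the landing page
    location / {
        return 301 /landing.html;
    }
}
```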
What do you think is the best way to delete a site from Google, and to delete all the links it had to other sites, so that there is NO history of the site? It seems that when you block it with robots.txt, the links and link juice do not disappear; only the "blocked by robots.txt" count in WMT increases.
Any suggestions?
-
The sites are massive, and we are talking massive numbers:
Google reports in WMT that Site B still has 259,157,970 links to Site A, although when you drill into the report it only shows a few.
The current state is that nothing is blocked on Site B, and ALL pages redirect to Site B's landing page.
In WMT for Site B, Google still shows data for all the reports: search queries, keywords, crawl errors (very old and all fixed), and so on. The reports and data do not bother me as much as the 259,157,970 links it reports to Site A.
On the 11th of April, when I started the process of getting rid of these links, there were 554,066,716. This jumped to 603,404,378 on the 28th of April, then started dropping and was as low as 122,405,100 on the 17th of May, before growing again to where it is now: 259,157,970.
I also noticed that while the pages were returning 404s, Google's crawl rate dropped to zero. Now that everything redirects to the landing page, the crawl rate is back up to about 1,800 pages per day, which is still very low considering the numbers we are talking about.
The crawl rate on Site A is okay at 220,000 pages per day, but it was as high as 800,000 per day at one stage.
-
If you remove all history of a website, it may still appear in the Wayback Machine.
If you blocked robots first, then they won't see the 301 redirects; they'll just keep the previously cached pages. Maybe remove the robots.txt and let Google recrawl every page with its 301 to the landing page, then add the robots.txt back after they've been processed. Have you tried submitting a new sitemap in Webmaster Tools pointing all pages at the landing page?
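If you try the sitemap route, the idea would be to list the old Site B URLs so that Google recrawls them and discovers the 301s. A sketch (the URLs are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Old Site B URLs, each now 301-redirecting to the landing page -->
  <url><loc>http://siteb.example.com/page-1.html</loc></url>
  <url><loc>http://siteb.example.com/page-2.html</loc></url>
  <!-- ... one entry per old URL ... -->
</urlset>
```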
Roughly how many pages are in your website?
-
I failed to mention that Sites A and B had exactly the same content, database, and URL structure; the only difference was the subdomain.