Removing/redirecting bad URLs from main domain
-
Our users create content, which we host at a separate URL for the web version. Originally this was hosted on our main domain.
This was causing problems because Google was seeing all these different types of content on our main domain. The page content was all over the place and (we think) may have harmed our main domain's reputation.
About a month ago, we added a robots.txt rule to block the URLs in that particular folder, so that Google doesn't crawl those pages and ignores them in the SERPs.
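For reference, the rule looked something like the sketch below (the folder name is a placeholder, not our actual path):

    User-agent: *
    Disallow: /user-content/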
We've now gone a step further and are 301-redirecting all those user-created URLs to a totally brand new domain (not affiliated with our brand or main domain).
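In Apache terms the redirect is something like this .htaccess sketch, assuming the same placeholder folder and a hypothetical new domain:

    # .htaccess sketch: send everything under the placeholder folder to the new domain
    RewriteEngine On
    RewriteRule ^user-content/(.*)$ https://newdomain.example/$1 [R=301,L]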
This should have been done from the beginning, but it wasn't.
Any suggestions on how we can remove all those original URLs and make Google see them as not affiliated with our main domain? Or should we just give it the good ol' time recipe and let it fix itself?
-
Yes, that's correct, Kurt. We want to disassociate our brand from those pages. Thanks for your feedback!
-
Yes, very helpful... Thanks!
-
It sounds to me like you don't want the search engines to know that you're moving the content, but rather have them think that you have dropped the pages from your site, because you don't want the search engines associating those pages with your site. Correct?
If that's the case, then you do want to keep the noindex on the old pages and set up 301 redirects as well. The redirects are for real users who happen to use any links/bookmarks to the old pages. By keeping the old pages noindexed, hopefully the search engines won't crawl them and won't follow the redirects. I'd also remove the pages from the Google and Bing indexes in their webmaster tools for good measure.
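One thing worth noting: a meta noindex tag in the page itself can't accompany a 301 (the redirected response's body never gets rendered), but an X-Robots-Tag HTTP header can. A hedged Apache server-config sketch, again assuming the placeholder folder from above:

    # Apache sketch: attach a noindex header to responses for the old folder.
    # "always" is needed so the header also rides along on 3xx (redirect) responses.
    <LocationMatch "^/user-content/">
        Header always set X-Robots-Tag "noindex"
    </LocationMatch>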
If you are linking from your site to the new location of the user content, you may want to nofollow those links or, better yet, create the links in JavaScript or something to hide them. If all the links to the content just shift to the new location, Google and Bing may still associate your site with the new site. Then again, if all the content from the old pages is on the new site, they may figure it all out anyway.
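Both variants might look something like this (the URLs are hypothetical):

    <!-- Plain nofollow version -->
    <a href="https://newdomain.example/some-page" rel="nofollow">View content</a>

    <!-- JavaScript-built version: the href never appears in the static HTML -->
    <span id="content-link"></span>
    <script>
      var link = document.createElement('a');
      link.href = 'https://newdomain.example/some-page'; // hypothetical destination
      link.textContent = 'View content';
      document.getElementById('content-link').appendChild(link);
    </script>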
-
You need to get rid of the robots.txt block on those URLs you want to redirect, Alec.
As it is now, with the robots block in place, you've told the search engines NOT to crawl those URLs, so it's going to be very difficult for them to discover the 301 redirects and learn that they should be dropping the old URLs from the index. After that, it is just a matter of time. (It can also help to leave those old URLs in the XML sitemap for a while to make it easier for the engines to crawl them and discover the 301s.)
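A minimal sketch of such a sitemap entry, with a placeholder URL:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Old URL left in temporarily so crawlers revisit it and discover the 301 -->
      <url>
        <loc>https://www.maindomain.example/user-content/some-page</loc>
      </url>
    </urlset>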
If none of those URLs were generating any substantial amount of traffic or incoming links, you could also use Google and Bing Webmaster Tools to request that the pages be removed from the index. This will only really work if the pages are organised in a specific directory, as it would likely take far too long to annotate each URL for removal otherwise.
Hope that helps?
Paul
Related Questions
-
Are IDs in URLs good for SEO? Will SEO submission sites allow such URL submissions?
Example URL: http://public.beta.travelyaari.com/vrl-travels-13555-online. It's our site's beta URL, which we are going to implement for the live site. After implementation, it will be live on travelyaari.com like this: "https://www.travelyaari.com/vrl-travels-13555-online". We have added the keywords to the URL ("VRL Travels"). But the problem is that there are multiple VRL Travels available, so we made the URL unique with a unique ID, "13555", which tells us exactly which VRL Travels it is and also solves URL duplication. From a user/SEO point of view, the URL still has readable text/keywords: "vrl travels online". Can some Moz experts suggest whether this will affect SEO performance in any manner? Will SEO submission sites accept this URL? Meanwhile, I tried submitting this URL to Reddit etc., and it got accepted.
-
Footer image links: good or bad?
Hi everybody! I have a very serious question, because I have a problem with this. We run a voucher-code website, and we have noticed that our rivals are putting their logos in the footers of online stores as images, sometimes linking to the home page and sometimes to a store page within the site. Should I ask the online stores for the same? I'm scared of getting a penalty from Google. Please help me with this and recommend something, because we are playing fair while our rivals are doing this and getting the best results in the SERPs. Thanks very much! Best regards!
-
Google Analytics question: removing a specific page's data from the total site's results?
I hope I can explain this clearly; hang in there! One of the clients of the law firm I work for does some SEO work for the firm, and one thing he has been doing is googling a certain keyword over and over again to trick Google's autofill into suggesting that keyword. When he runs his program, he generates around 500 hits to one of our attorneys' bio pages. This happens once or twice a week, and since I don't consider them real organic traffic, it has been really messing up my GA reports. Is there a way to block that landing page from my overall reports? Or is there a better way to deal with the skewed data? Any help or advice is appreciated. I am still so new to SEO that I feel like a lot of my questions are obvious, but please go easy on me!
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS system has a solution to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (and not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. What portion of traffic to throttle, and for which bots, can be calculated dynamically at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
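A minimal sketch of that kind of load-based 503 throttling, written here in Node/TypeScript purely for illustration (the CMS language isn't stated, and the bot list, load threshold, and rejection rule are all assumptions):

    // Illustrative sketch: answer a share of bot requests with 503 when
    // total server load is high. Bot list and thresholds are assumptions.
    import * as http from "http";
    import * as os from "os";

    const BOT_TOKENS = ["bingbot", "ahrefsbot", "googlebot"]; // assumed signatures
    const LOAD_THRESHOLD = 4.0; // assumed 1-minute load average cutoff

    const server = http.createServer((req, res) => {
      const ua = (req.headers["user-agent"] || "").toLowerCase();
      const isBot = BOT_TOKENS.some((token) => ua.includes(token));
      const load = os.loadavg()[0]; // total server load, not one site's traffic

      if (isBot && load > LOAD_THRESHOLD) {
        // Reject a load-proportional share of bot requests; real users are
        // never throttled, so user traffic always takes priority over bots.
        const rejectShare = Math.min(1, (load - LOAD_THRESHOLD) / LOAD_THRESHOLD);
        if (Math.random() < rejectShare) {
          res.writeHead(503, { "Retry-After": "120", "Content-Type": "text/plain" });
          res.end("Server busy, please retry later.");
          return;
        }
      }

      // ...normal CMS request handling would go here...
      res.writeHead(200, { "Content-Type": "text/plain" });
      res.end("OK");
    });

    server.listen(8080);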
-
Bad for SEO to have two very similar websites on the same server?
Is it bad for SEO to have two very similar sites on the same server? What's the best way to set this up?
-
Penalty for all new sites on a domain?
Hi all, a friend has an interesting problem. He got a manual link penalty at the end of 2011. It is an old domain with a domain pop of over 5,000, but with a lot of bad links (widgets, banners, and other SEO domains, but nothing like ScrapeBox etc.). He lost most of his traffic a few days after the notification in WMT (unnatural links), and again after the first Penguin update in April 2012. At the end of 2012, after deleting (or nofollowing) and disavowing a lot of links, Google lifted the manual penalty (WMT notification). But nothing happened after the lifting; the rankings didn't improve (it's been 4 months already!). Almost all money keywords aren't in the top 100, there are no traffic increases, and he has good content on this domain. We built a handful of new trust links to test some pages, but nothing improved. In February we ran a test and built a completely new page on this domain; it's in the menu and got some internal links from content. We did it because some pages which weren't optimized before the penalty (no external backlinks) are still ranking on the first Google page for small keywords. After a few days, the new page started to rank, with our keyword between positions 40-45. That was OK and as we expected. This page ranked there constantly for almost 6 weeks, and now it's been gone for ten days. We didn't change anything. It's the same phenomenon as with the old pages on this domain... the page doesn't even rank for its title! Could it still be a manual penalty on the whole domain, or what other reasons are possible? Looking forward to your ideas, and I hope you understand the problem! 😉 Thanks!!!
-
Redirect n domains to one
What happens when I 301-redirect 10 domains to one? I have 10 domains with an average Page Authority of 45 and Domain Authority of 60, and I want to strengthen my new domain by redirecting them to it. Is that right or wrong?
-
What's been your experience with profile link-building?
What have your experiences been? Short term? Long term? There isn't a lot written about it, and I'm wondering where it falls in the order of things. I was very hesitant to jump in, but I have launched a few campaigns, both for local geo-targeted phrases and for national accounts. Surprisingly, I've seen a surge in rankings, but I also wonder how short-lived they will be. I've noticed the links still don't come up in tools like Open Site Explorer, but I'm able to find them when searching for the unique username I used while building the profiles. The sites I'm listing on have no relevance to my industry, unless by chance, although the PRs of the sites I'm using are all 4 or higher. Is this considered gray hat?