Removing/Redirecting bad URLs from main domain
-
Our users create content which we host at a separate URL for a web version. Originally this was hosted on our main domain.
This was causing problems because Google was seeing all these different types of content on our main domain. The page content was all over the place and (we think) may have harmed our main domain's reputation.
About a month ago, we added a robots.txt rule to block the URLs in that particular folder, so that Google doesn't crawl those pages and ignores them in the SERPs.
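For context, the robots.txt rule in question would have looked something like this (the folder name here is purely illustrative — the actual path wasn't given):

```text
# Block crawling of the user-generated content folder (path is illustrative)
User-agent: *
Disallow: /user-content/
```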
We've now gone a step further and are redirecting (301 redirect) all of those user-created URLs to a totally brand-new domain (not affiliated with our brand or main domain).
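As a sketch of that redirect, assuming an Apache server, a hypothetical `/user-content/` folder, and a placeholder new domain (none of these specifics were given in the question):

```apache
# Hypothetical .htaccess rule: 301 every URL under /user-content/
# to the matching path on the (placeholder) new domain
RedirectMatch 301 ^/user-content/(.*)$ https://example-new-domain.com/user-content/$1
```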
This should have been done from the beginning, but it wasn't.
Any suggestions on how we can remove all those original URLs and make Google see them as not affiliated with our main domain? Or should we just give it the good ol' time recipe and let it fix itself?
-
Yes, that's correct, Kurt. We want to disassociate our brand from those pages. Thanks for your feedback!
-
Yes, very helpful... Thanks!
-
It sounds to me like you don't want the search engines to know that you're moving the content, but rather have them think you've dropped the pages from your site, because you don't want the search engines associating those pages with your site. Correct?
If that's the case, then you do want to keep the noindex on the old pages and set up 301 redirects as well. The redirects are for real users who follow links/bookmarks to the old pages. By keeping the old pages noindexed, hopefully the search engines won't crawl them and won't follow the redirects. I'd also remove the pages from the Google and Bing indexes via their webmaster tools for good measure.
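For reference, a noindex can be applied with a robots meta tag on each old page:

```html
<!-- On each old page, in the <head> -->
<meta name="robots" content="noindex">
```

Alternatively, sending an `X-Robots-Tag: noindex` HTTP response header achieves the same thing and also works for non-HTML files.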
If you are linking from your site to the new location of the user content, you may want to nofollow those links or, better yet, create the links in JavaScript or something to hide them. If all the links to the content simply shift to the new location, Google and Bing may still associate your site with the new site. Then again, if all the content from the old pages is on the new site, they may figure it all out anyway.
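An illustrative sketch of both options (the URL is a placeholder, and the JavaScript approach is just one way to keep the link out of the crawlable HTML):

```html
<!-- Option 1: a plain link with rel="nofollow" -->
<a href="https://example-new-domain.com/page" rel="nofollow">user content</a>

<!-- Option 2: build the link in JavaScript so crawlers are less likely to follow it -->
<span class="js-link" data-href="https://example-new-domain.com/page">user content</span>
<script>
  document.querySelectorAll('.js-link').forEach(function (el) {
    el.addEventListener('click', function () {
      window.location.href = el.dataset.href;
    });
  });
</script>
```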
-
You need to get rid of the robots.txt block on those URLs you want to redirect, Alec.
As it is now, with the robots block in place, you've told the search engines NOT to crawl those URLs, so it's going to be very difficult for them to discover the 301 redirects and learn that they should be dropping the old URLs from the index. After that, it's just a matter of time. (It can also help to leave those old URLs in the XML sitemap for a while to make it easier for the engines to crawl them and discover the 301s.)
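Once the robots block is lifted, it's worth spot-checking that the old URLs actually land on the new domain. Here's a rough stdlib-only sketch; the domain name is a placeholder, and note that `urlopen` follows the whole redirect chain, so this reports the final landing URL rather than the individual hop codes:

```python
# Rough sketch: spot-check that old URLs end up on the new domain.
# Domain names below are placeholders, not the asker's real domains.
from urllib.parse import urlparse
from urllib.request import Request, urlopen

NEW_DOMAIN = "example-new-domain.com"  # placeholder for the new domain

def is_on_new_domain(url, new_domain=NEW_DOMAIN):
    """True if the URL's host is the new domain (or a subdomain of it)."""
    host = urlparse(url).netloc.lower()
    return host == new_domain or host.endswith("." + new_domain)

def check_redirect(old_url):
    """Follow redirects on an old URL and return the final landing URL."""
    resp = urlopen(Request(old_url, method="HEAD"))
    return resp.geturl()
```

Running `check_redirect` over a sample of the old URLs (or the old sitemap) makes it easy to catch redirect chains or pages still resolving on the main domain.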
If none of those URLs were generating any substantial amount of traffic or incoming links, you could also use Google and Bing Webmaster Tools to request that the pages be removed from the index. This will only really work if the pages are organised in a specific directory, as it would likely take far too long to annotate each URL for removal otherwise.
Hope that helps?
Paul