Removing/redirecting bad URLs from the main domain
-
Our users create content that we host at a separate URL for a web version. Originally this was hosted on our main domain.
This was causing problems because Google was seeing all these different types of content on our main domain. The page content was all over the place and (we think) may have harmed our main domain's reputation.
About a month ago, we added a robots.txt rule to block the URLs in that particular folder, so that Google doesn't crawl those pages and ignores them in the SERPs.
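As a sketch, a robots.txt block for a folder like that might look like this (the folder name here is hypothetical):

```text
# robots.txt at the root of the main domain -- folder name is a placeholder
User-agent: *
Disallow: /user-content/
```

Note that this only blocks crawling; URLs already in the index can still appear in results until they're removed or redirected.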
We have now gone a step further and are 301-redirecting all those user-created URLs to a brand new domain (not affiliated with our brand or main domain).
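For reference, a blanket 301 of that folder to the new domain could be done in Apache roughly like this (the folder and domain names are hypothetical; nginx or an app framework would use its own syntax):

```apacheconf
# .htaccess on the main domain -- permanently redirect the whole
# user-content folder to the new, unaffiliated domain (placeholder names)
RedirectMatch 301 ^/user-content/(.*)$ https://new-domain.example/$1
```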
This should have been done from the beginning, but it wasn't.
Any suggestions on how we can remove all those original URLs and make Google see them as not affiliated with our main domain? Or should we just give it the good ol' time recipe and let it fix itself?
-
Yes, that's correct, Kurt. We want to disassociate our brand from those pages. Thanks for your feedback!
-
Yes, very helpful... Thanks!
-
It sounds to me like you don't want the search engines to know that you're moving the content, but rather have them think you've dropped the pages from your site, because you don't want the search engines associating those pages with your site. Correct?
If that's the case, then you do want to keep the noindex on the old pages and set up 301 redirects as well. The redirects are for real users who happen to use links/bookmarks to the old pages. By keeping the old pages noindexed, hopefully the search engines won't crawl them and won't follow the redirects. I'd also remove the pages from the Google and Bing indexes in their webmaster tools for good measure.
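For anyone following along, the noindex directive is just a meta tag in the page head. One caveat: once a URL returns a 301, it no longer serves HTML, so the tag only applies while the old page still renders content:

```html
<!-- In the <head> of the old page, while it still serves HTML -->
<meta name="robots" content="noindex">
```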
If you are linking from your site to the new location of the user content, you may want to nofollow those links or, better yet, create the links in JavaScript or something to hide them. If all the links to the content just shift to the new location, Google and Bing may still associate your site with the new site. Then again, if all the content from the old pages is on the new site, they may figure it out anyway.
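One way to keep those links out of the raw HTML, as suggested above, is to build them client-side. A minimal sketch, assuming a hypothetical element ID and URL (rel="nofollow" is added as an extra hint):

```javascript
// Hypothetical sketch: construct the outbound link markup in JavaScript so
// crawlers reading the static HTML don't see it.
function buildOutboundLink(href, text) {
  return `<a href="${href}" rel="nofollow">${text}</a>`;
}

// In the browser you would then inject it, e.g.:
// document.getElementById('user-content-link').innerHTML =
//   buildOutboundLink('https://new-domain.example/page-1/', 'View content');
```

Keep in mind search engines do execute JavaScript these days, so this is obscuring the links rather than truly hiding them.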
-
You need to get rid of the robots.txt block on those URLs you want to redirect, Alec.
As it is now, with the robots block in place, you've told the search engines NOT to crawl those URLs, so it's going to be very difficult for them to discover the 301 redirects and learn that they should be dropping the old URLs from the index. After that, it's just a matter of time. (It can also help to leave those old URLs in the XML sitemap for a while, to make it easier for the engines to crawl them and discover the 301s.)
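As a sketch, leaving the old URLs in the XML sitemap temporarily just means keeping entries like these (the URLs are placeholders) until the engines have picked up the 301s:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Old, now-redirecting URLs left in place so the 301s get crawled -->
  <url><loc>https://main-domain.example/user-content/page-1/</loc></url>
  <url><loc>https://main-domain.example/user-content/page-2/</loc></url>
</urlset>
```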
If none of those URLs were generating any substantial amount of traffic or incoming links, you could also use Google and Bing Webmaster Tools to request that the pages be removed from the index. This will only really work if the pages are organised in a specific directory, as it would likely take far too long to submit each URL for removal otherwise.
Hope that helps?
Paul