Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Cleaning up a Spammy Domain VS Starting Fresh with a New Domain
-
Hi - Can you give me your opinion please? If you look at murrayroofing.com and see the high spam score - and the fact that our domain has been put on some spammy sites over the years - is it better and faster for placing higher in the Google SERPs if we create a fresh new domain? My theory is we will spin our wheels trying to get unlisted from a lot of those spammy linking sites, and that it would be faster to see results using a fresh new domain rather than trying to clean up the current spammy domain. Thanks in advance - you guys have been awesome!!
-
Disavowing has nothing to do with traffic.
Disavowing is all about spam signals from spammy links. That and only that.
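For context, a disavow file is just a plain-text list uploaded through Google Search Console's disavow tool - one link or one whole domain per line. A minimal sketch (the domains below are made-up placeholders, not real offenders):

```text
# Lines starting with # are comments.
# "domain:" disavows every link from that entire site:
domain:spammy-directory.example
domain:link-farm.example
# A single page can be disavowed by listing its full URL:
http://spam-listings.example/roofing-page.html
```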
-
Thanks again for all the advice- Truly appreciated-
What are your thoughts on "disavowing" murrayroofing.com with Google, so that when it sends traffic to the new murrayroofingllc.com, Google will hopefully ignore it? Can you see our account in Moz? You can see the old domain is sending traffic, since it is listed on the spammy sites.
-
You are always welcome.
If you have more questions, you can always hit me up on my Twitter @DigitalSpaceman
-
Thank you!!
-
Hard to say who is putting you on those websites, or why.
The only way to truly get rid of those backlinks is to reach out to those websites' owners. You'd obviously have to find someone who speaks the language.
Now, what you can do though is this:
- Disavow all those crappy links - that'll get Google to lower the "spam score" of your website;
- Block all traffic by IPs, geolocation and/or hostnames/referrers (that'll keep out the actual unrelated traffic)
That should clean it up pretty good.
Of course, that requires full control and ownership of that domain and website code. If you can't get that - again, my suggestion is just to part ways.
-
This is awesome info! Thank you. What are your thoughts on trying to get backlinks removed from sites in China where we have no way to contact them - none of the wording on the sites is in our language - and it seems like it would be impossible to get removed from some of them. Additional thoughts greatly appreciated. In analytics we see "more" traffic from China than the US.
I'm convinced a competitor may be listing us on these sites - or one of those SEO guys who get really pissed when we turn them down. Could they be out there putting our domain on listing sites?
-
Yeah, your suggestion makes sense.
Keep the old one while the new one is ranking up.
Now, here is the perfect scenario for you: keep working on the new site, and get full ownership of the old one. Then, through IP blocks, Cloudflare, removing all the spammy backlinks, etc., get rid of all or most of the spammy traffic and signals. And then redirect.
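When the redirect step finally comes, it could be as simple as a couple of lines in the old domain's .htaccess - a sketch, assuming Apache with mod_rewrite enabled:

```apache
# Hypothetical sketch: 301-redirect everything from the old domain
# to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?murrayroofing\.com$ [NC]
RewriteRule ^(.*)$ https://murrayroofingllc.com/$1 [R=301,L]
```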
-
Thank you again!
I should have been more clear - the old website gets traffic that does convert. If it loaded faster than 10 seconds, I'm sure a lot more would convert - super high bounce rate due to the slooooow loading of that site. But we do get "valid leads" every week from it. Not a lot of leads - maybe 5 a week - but our jobs are large-dollar jobs.
What is your thought on running both sites separately? We could go in and make sure they are not duplicates and assign different addresses and phone numbers to the old site - but this "seems" black hat. We would not be doing it to get both sites to rank, just so we don't lose the traffic - then in a year or so get rid of it. What are your thoughts?
-
"... maybe a lot of traffic will convert. "
WILL convert? So it's not converting now? If so, it's kind of optimistic to expect that will change, no?
Since you don't own the old domain, you can't really reliably do anything about it anyway.
At this point, I would say not to forward at all - start from scratch.
-
Thank you - yes, some of the traffic - maybe a lot of it - will convert. The problem is old "printed" directories and other places where we can't update the domain. We get a lot of business from a printed catalog that won't change for a year or more.
I will look at the suggestions you made about IP limitations. The other issue is we don't "own" the original domain, so we have to ask the owner - who is also our IT guy - to change settings. This is another reason we bought the new domain.
Again thank you!
-
There are a couple of ways you can go about it.
-
Is any of the traffic going to the old spammy domain any good? Does it convert? If not, then don't worry about redirecting - there wouldn't be any point, only spam signals.
-
If there is some good traffic, then set up IP limitations, hostname limitations, etc. That can be done in .htaccess or on the server itself. There are other, more elaborate ways to filter out spam traffic as well, but that depends on how familiar you or your IT guy are with them. One of the simplest solutions is to route all traffic through Cloudflare - it has quite nice spam filtering, and it's free.
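As a rough sketch of what those .htaccess limitations might look like (Apache 2.4 syntax; the referrer pattern and IP range below are placeholders, not actual offenders):

```apache
# Hypothetical sketch: refuse requests arriving from a spammy referrer...
RewriteEngine On
RewriteCond %{HTTP_REFERER} spammy-directory\.example [NC]
RewriteRule .* - [F,L]

# ...and block an offending IP range outright.
<RequireAll>
  Require all granted
  Require not ip 203.0.113.0/24
</RequireAll>
```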
Hope this helps.
-
Thank you - we're talking about murrayroofingllc.com in particular. We are not sure how to forward the old domain to the new one - we "know how", we just don't know if we should. The reason we developed murrayroofingllc.com is because murrayroofing.com had a high spam score, and we got advice from this thread to go for a new domain.
Now the concern is: if we forward all the traffic from murrayroofing.com to murrayroofingllc.com, the new domain murrayroofingllc.com will be negatively affected by the spammy traffic. Somehow murrayroofing.com got on some spam sites and we get a ton of spammy traffic from China - we don't want this traffic - and for these sites there is "no way" to ask them to remove our website from their spam sites in China.
All thoughts are welcome here-
-
Ta Larry.
OK, nothing much of substance there. That said, if it's ranking, it's worth trying to build on, as that's usually an easier and faster route to page 1.
I had a look at the Murray Roofing site, and it has not been optimised for the customer queries a roofing contractor would seek to rank for. As it seems you are keen to start afresh, you can do both in parallel - no harm to either.
That said, I would suggest you also look at your Google My Business structure - you're effectively a local play. Getting reviews and appearing in the local search pack for "roofing contractors Omaha" etc. is what we would consider a client priority.
All the best - go get them.
-
Only for a few, and we are in positions 49 and 50 for them.
-
Hi
Is the current site ranking for any terms of value?
-
Hi there,
Yes, absolutely get a new domain. If you look at DA, it's only 15 (not too bad in some cases). But if you look at the backlink profile, you'll see that most of the links are from listing sites - Homestead, Yellow Pages, ezlocal, etc. You can replicate that profile with a day of work. And, as you said, the spam score will only bring trouble.
Hope this helps.
Related Questions
-
How Can I Redirect an Old Domain to Our New Domain in .htaccess?
There is an old version of http://chesapeakeregional.com still floating around the web here: http://www.dev3.com.php53-24.dfw1-2.websitetestlink.com/component/content/category/20-our-services. Various iterations of this domain pop up when I do certain site: searches and for some queries as well (such as "Diagnostic Center of Chesapeake"). About 3 months ago the websitetestlink site had files and fully functional navigation, but now it mostly returns 404 or 500 errors. I'd like to redirect the site to our newer site, but don't believe I can do that in chesapeakeregional.com's .htaccess file. Is that so, and would I need access to the websitetestlink .htaccess to forward the domain? Note: neither I nor anyone else in our organization has the login for the old site. The new site went live about 9 months before I arrived at the organization and I've been slowly putting the pieces together since arriving.
Intermediate & Advanced SEO | smpomoryCRH
-
Too many backlinks from one domain?
I've been in the process of creating a tourism-based website for the state of Kansas. I'm a photographer for the state, and have inked a nice little side income to my day job as a web designer by selling prints from Kansas (along with my travels elsewhere). I'm still in the process of developing it, but it's at least at a point that I need to really start thinking about SEO factor of the amount of backlinks I have from it going back to my main photography website. The Kansas site is at http://www.kansasisbeautiful.com and my photography website is http://www.mickeyshannon.com. This tourism website will serve a number of purposes: To promote the state and show people it's not just a flat, boring place. To help promote my photography. The entire site is powered by my photography. To sell a book I'm planning to publish later this year/early next year of Kansas images. To help increase sales of photography prints of my work. What I'm worried about is the amount of backlinks I have going from the Kansas site to my photography site. Not to mention every image is hosted on my photography domain (no need to upload to two domains when one can serve the same purpose). I'm currently linking back to my site on most pages via a little "Like the Photos? Buy a print" link in the top right corner. In addition, when users get to the website map, all photo listings click back to a page on my photography site that they can purchase prints. And the main navigation also has a link for "Photos" that takes them to my Kansas photo galleries on my photography website as well. The question I have: Is it really bad SEO-wise to have anywhere from 1 to 10+ backlinks on every page from one domain (kansasisbeautiful.com) linking back to mickeyshannon.com? Would I be better served moving all of the content from kansasisbeautiful into a subdirectory on my photography site (mickeyshannon.com/kansas/) and redirecting the entire domain there? 
I haven't actually launched this website yet, so I'm trying to make the right call before pushing it to the public. Any advice would be appreciated!
Intermediate & Advanced SEO | msphoto
-
Domain dominance
I've just started to work for a company who've purchased masses of domains with every conceivable permutation based on all their products with every extension possible e.g .biz . eu. .net (including .co.uk and .com of course). I have two questions: 1. Is it worth keeping all these (they want to add more) domains or let them expire? 2. All the purchased domains are online - is there any point (they redirect with a 301)?
Intermediate & Advanced SEO | LJHopkins
-
Referring domain issues
Our website (blahblah).org has 32 other domains pointing to it, all from the same IP address. These domains, including the one in question, were all purchased by the website owner, who has inadvertently created duplicate content on most of them. Some of these referring domains have 301s, some don't - but it appears they have all been de-indexed by Google. I'm somewhat out of my depth here (most of what I've said above has come from an agency who said we should address this before being slapped by Google). However, I need to explain to my line manager the actual issues in more detail, and the repercussions - can anyone please offer advice? I'm happy to use the agency, or another, but would like some second opinions if possible.
Intermediate & Advanced SEO | LJHopkins
-
Microsites: Subdomain vs own domains
I am working on a travel site about a specific region, which includes information about lots of different topics, such as weddings, surfing etc. I was wondering whether it's a good idea to register domains for each topic, since it would enable me to build backlinks. I would basically keep the design more or less the same and implement a nofollow navigation bar on each microsite, e.g.:
weddingsbarcelona.com
surfingbarcelona.com
Or should I rather go with one domain and subfolders:
barcelona.com/weddings
barcelona.com/surfing
I guess the second option is how I would usually do it, but I just wanted to see what the pros/cons of both options are. Many thanks!
Intermediate & Advanced SEO | kinimod
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings to find the vehicle they want.
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time, based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.
We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the url directly, from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would lead to 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're pagerank sculpting?)
Noindex advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)
Noindex disadvantages:
- Difficult to implement (vehicle details pages are served using Ajax, so they have no head tag). The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex tag based on querystring variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it)
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindex pages. I say "force" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if blocked by robots.txt
Hash (#) URL advantages:
- By using hash (#) URLs for links on Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with Javascript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like pagerank sculpting (?)
- Does not require complex Apache stuff
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since they can't crawl/follow them?
Initially, we implemented robots.txt - the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're pagerank sculpting or something like that. If we implement noindex on these pages (and doing so is a difficult task itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal, in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO. My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these. Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
-
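For what it's worth, the X-Robots-Tag idea described in that question might be sketched in Apache 2.4 like this (assuming, hypothetically, that vehicle details requests carry a `vehicle_id` query parameter - the parameter name is a placeholder):

```apache
# Hypothetical sketch: attach a noindex header to Ajax-served vehicle
# details pages, keyed off a placeholder querystring variable.
<IfModule mod_headers.c>
  <If "%{QUERY_STRING} =~ /(^|&)vehicle_id=/">
    Header set X-Robots-Tag "noindex, nofollow"
  </If>
</IfModule>
```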
Should I buy a .co domain if my preferred .com and .co.uk domain are taken by other companies?
I'm looking to boost my website's ranking and drive more traffic to it using a keyword-rich domain name. I want to have my nearest city followed by the keyword "seo" in the domain name, but the .co.uk and .com have already been taken. Should I take the plunge and buy the .co at a higher price? What options do I have? Also, whilst we're on domains and URLs, is it best to separate keywords in URLs with an underscore (_) or a hyphen (-)? Many thanks for any help with this matter. Alex
Intermediate & Advanced SEO | SeoSheikh
-
How to clean up a SERP?
I have a new customer and he wants me to clean up the SERP for his branded keyword. The SERP currently has his site and two other sites related to him under his result; under that are bad reviews and old reports. My client does own the top spot (#1) for his branded name. My client has a LinkedIn, Facebook, Twitter, and MySpace. I was thinking to push all of these to the first page, which would clear out some of those bad reviews. What are your thoughts? Have any of you ever had this type of case? I need to get 6 different sites to all rank for the same exact key term; however, I have the top spot to link from...
Intermediate & Advanced SEO | SEODinosaur