Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Cleaning up a Spammy Domain VS Starting Fresh with a New Domain
-
Hi - Can you give me your opinion please? If you look at murrayroofing.com and see the high spam score - and the fact that our domain has been put on some spammy sites over the years - would we place higher in the Google SERPs, and faster, if we create a fresh new domain? My theory is we will spin our wheels trying to get unlisted from a lot of those spammy linking sites, and that it would be faster to see results using a fresh new domain rather than trying to clean up the current spammy domain. Thanks in advance - you guys have been awesome!!
-
Disavowing has nothing to do with traffic.
Disavowing is all about spam signals from spammy links. That and only that.
-
Thanks again for all the advice- Truly appreciated-
What are your thoughts on "disavowing" murrayroofing.com with Google, so that when it sends traffic to the new murrayroofingllc.com, Google will hopefully ignore it? Can you see our account in Moz? You can see the old domain is sending traffic since it is listed on the spammy sites.
-
You are always welcome.
If you got more questions, you can always hit me up on my Twitter @DigitalSpaceman
-
Thank you!!
-
Hard to say who is putting you on those websites, or why.
The only way to truly get rid of those backlinks is to reach out to the owners of those websites. You'd obviously have to find someone who speaks the language.
Now, what you can do though is this:
- Disavow all those crappy links - that'll get Google to lower the "spam score" of your website;
- Block all traffic by IPs, geolocation and/or hostnames/referrers (that'll keep the actual unrelated traffic off the site).
That should clean it up pretty good.
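For what it's worth, the disavow file Google accepts is just a plain-text list - one URL or domain: entry per line, with # for comments - uploaded through Search Console's disavow links tool. A minimal sketch (the domains below are placeholders, not the actual spammy sites):

```text
# Disavow file for murrayroofing.com - example entries only;
# spammy-directory.example and spam-links.example are placeholders
# for the real domains found in your backlink reports.
domain:spammy-directory.example
domain:spam-links.example
http://spam-links.example/single-spammy-page.html
```

Using the domain: form is usually safer than listing individual URLs, since these directory sites tend to link from many pages.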
Of course, that requires full control and ownership of that domain and website code. If you can't get that - again, my suggestion is just to part ways.
-
This is awesome info! Thank you. What are your thoughts on trying to get backlinks removed from sites in China where we have no way to contact them? None of the wording on the sites is in our language, and it seems like it would be impossible to get removed from some of them. Additional thoughts greatly appreciated. In analytics we see more traffic from China than from the US.
I'm convinced a competitor may be listing us on these sites - or one of those SEO guys who get really pissed when we turn them down. Could they be out there putting our domain on listing sites?
-
Yeah, your suggestion makes sense.
Keep the old one while the new one is ranking up.
Now, here is the perfect scenario for you - keep working on the new site, and get full ownership of the old one. Then, through IP blocks, Cloudflare, removing all the spammy backlinks, etc., get rid of all or most of the spammy traffic and signals. And then redirect.
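If you do get to that final redirect step, a minimal sketch of what it could look like in the old domain's .htaccess (assuming Apache with mod_rewrite enabled; the domain names are the ones from this thread):

```apache
# On the OLD domain (murrayroofing.com), after cleanup:
# 301-redirect everything to the new domain, preserving paths.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?murrayroofing\.com$ [NC]
RewriteRule ^(.*)$ https://murrayroofingllc.com/$1 [R=301,L]
```

The R=301 flag makes the redirect permanent, which is what passes the link signals - good and bad - which is exactly why the cleanup should happen first.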
-
Thank you again!
I should have been more clear - the old website gets traffic that does convert. If it loaded in under 10 seconds, I'm sure a lot more would convert - there's a super high bounce rate due to the slooooow loading of that site. But we do get valid leads every week from it. Not a lot - maybe 5 a week - but our jobs are large-dollar jobs.
What is your thought on running both sites separately? We could go in and make sure they are not duplicates, and assign different addresses and phone numbers to the old site. But this seems black hat - we would not be doing it to get both sites to rank, just so we don't lose the traffic - then in a year or so we'd get rid of it. What are your thoughts?
-
"... maybe a lot of traffic will convert. "
WILL convert? So it's not converting now? If so, it's kind of optimistic to expect that to change, no?
Since you don't own the old domain, you can't really do anything about it reliably anyway.
At this point, I would say don't forward at all - start from scratch.
-
Thank you - yes, some of the traffic - maybe a lot of it - will convert. The problem is old printed directories and other places where we can't update the domain. We get a lot of business from a printed catalog that won't change for a year or more.
I will look at the suggestions you made about IP limitations. The other issue is we don't own the original domain, so we have to ask the owner - who is also our IT guy - to change settings. This is another reason we bought the new domain.
Again thank you!
-
There are a couple of ways you can go about it.
-
Is any of the traffic going to the old spammy domain any good? Does it convert? If not, then don't worry about redirecting - there wouldn't be any point, only spam signals.
-
If there is some good traffic, then set up IP limitations, hostname limitations, etc. That can be done in .htaccess or on the server itself. There are other, more elaborate ways to filter out spam traffic as well, but that depends on how familiar you or your IT guy are with them. One of the simplest solutions is to route all traffic through Cloudflare; it has quite nice spam filtering, and it's free.
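As a rough sketch of the .htaccess approach (assuming Apache 2.4 with mod_rewrite; the referrer domains and IP range below are placeholders - swap in the actual ones from your analytics and backlink reports):

```apache
# Refuse requests that arrive via known spammy referrers.
# spammy-directory.example and spam-links.example are placeholders.
RewriteEngine On
RewriteCond %{HTTP_REFERER} spammy-directory\.example [NC,OR]
RewriteCond %{HTTP_REFERER} spam-links\.example [NC]
RewriteRule .* - [F,L]

# Block specific IPs or ranges (Apache 2.4 Require syntax).
# 198.51.100.0/24 is a placeholder documentation range.
<RequireAll>
    Require all granted
    Require not ip 198.51.100.0/24
</RequireAll>
```

Note that referrer blocking only catches click-through traffic that sends a Referer header; it does nothing about the links themselves, which is what the disavow file is for.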
Hope this helps.
-
Thank you - we're talking about murrayroofingllc.com in particular. We are not sure whether to forward the old domain to the new one. We know how, we just don't know if we should. The reason we developed murrayroofingllc.com is that murrayroofing.com had a high spam score, and we got advice in this thread to go for a new domain.
Now the concern is: if we forward all the traffic from murrayroofing.com to murrayroofingllc.com, the new domain murrayroofingllc.com will be negatively affected by the spammy traffic. Somehow murrayroofing.com got on some spam sites and we get a ton of spammy traffic from China. We don't want this traffic, and for these sites there is no way to ask them to remove our website from their spam listings.
All thoughts are welcome here.
-
Ta, Larry.
OK, nothing much of substance there. That said, if the site is ranking at all, it's worth trying to build on, as that is usually an easier and faster route to page 1.
I had a look at the Murray Roofing site, and it has not been optimised for the customer queries a roofing contractor would seek to rank for. As it seems you are keen to start afresh, you can do both in parallel - no harm to either.
That said, I would suggest you also look at your Google My Business setup - you're effectively a local play. Getting reviews and appearing in the local search pack for "roofing contractors Omaha" etc. is what we would consider a client priority.
All the best - go get them.
-
Only for a few, and we are in positions 49 and 50 for them.
-
Hi
Is the current site ranking for any terms of value?
-
Hi there,
Yes, absolutely get a new domain. If you look at DA, it's only 15 (not too bad in some cases). But if you look at the backlink profile, you'll see that most of the links are from listing sites - Homestead, YellowPages, EZlocal, etc. You can replicate that profile after a day of work. And, as you said, the spam score will only bring trouble.
Hope this helps.