Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
The use of a ghost site for SEO purposes
-
Hi Guys,
Have just taken on a new client (.co.uk domain) and during our research have identified they also have a .com domain which is a replica of the existing site but all links lead to the .co.uk domain. As a result of this, the .com replica is pushing 5,000,000+ links to the .co.uk site.
After speaking to the client, it appears they were approached by a company who said that they could get the .com site ranking for local search queries and then push all that traffic to .co.uk. From analytics we can see that very little referrer traffic is coming from the .com.
It sounds remarkably dodgy to us - surely the duplicate site is an issue anyway for obvious reasons, and these links could also be deemed to have been created purely for SEO gain?
Does anyone have any experience of this as a tactic?
Thanks,
Dan
-
The appropriate solution here would be to use the hreflang tag to relate the two geographically targeted sites to each other. However, before you take that step, I would make sure the previous SEO company that created the .com did not point any harmful links at the .com domain, which would make it inadvisable to connect the two sites. Use Open Site Explorer to review the backlink profile, and verify the .com site in Google Webmaster Tools to check for any manual penalty notices. If all looks clear, go ahead with the hreflang implementation.
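As a rough sketch of what that implementation looks like (the domains below are placeholders, not the client's actual sites), each page on both domains would declare its regional alternates in the `<head>`:

```html
<!-- Placed on BOTH https://example.com/ and https://example.co.uk/ -->
<link rel="alternate" hreflang="en-us" href="https://example.com/" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/" />
<!-- x-default names the fallback page for users matching neither locale -->
<link rel="alternate" hreflang="x-default" href="https://example.co.uk/" />
```

Note the annotations must be reciprocal: if the .co.uk page references the .com page, the .com page must reference the .co.uk page back, or Google ignores the tags.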
Good luck and feel free to ask more questions here!
-
Thanks Both,
Pretty much confirms our thoughts here and yes Eddie - it appears to be a smash and grab job.
Dan
-
Certainly sounds dodgy, but suddenly removing all of those backlinks might cause you some SEO issues.
Depending on how Google is currently reading your site, it may improve (the site would look less spammy without those links), or it may really hurt, at least to start with: losing that many backlinks at once might make Google think something is up with your site.
I would bite the bullet and remove the duplicate content, but warn your client that it may take a while for the natural benefits to come through. If the site isn't penalised yet for having that many dodgy backlinks plus duplicate content, it soon will be!
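If the client does decide to retire the .com entirely, the usual way to "remove" it without simply dropping the pages is a permanent 301 redirect to the .co.uk equivalents. A minimal Apache sketch (the domain names are placeholders, and this assumes an .htaccess-enabled host):

```apache
# Send every .com URL to its .co.uk equivalent, preserving the path,
# so any legitimate links to the ghost site pass to the real one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.co.uk/$1 [R=301,L]
```

This also resolves the duplicate-content problem at the same time, since the .com pages stop resolving as a separate copy of the site.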
-
Certainly seems like the wrong thing to do. A good test: if you think it may be dodgy, it probably is. I certainly wouldn't recommend it as a tactic. There are potentially multiple issues here: duplicate content, as you mentioned, but also dilution of real links. Good quality, legitimate links could point to the ghost site and therefore not count for the real site.
I have seen cases where a legitimate attempt to run a .com and a .co.uk off the same shop ended up with both versions live due to incompetent development, though I didn't have to deal with cleaning it up. Un-picking that could be messy. A good example of quick-fix SEO for a fast buck, I suspect.
-
5mil+ links?! Wow!
What's their spam score? I'm surprised they haven't been blocked or something.
To answer your question: what does common sense tell you? Google and its bots pretty much run on common sense. A duplicate-content website, a ridiculous number of links, no referral traffic: all of these are obvious signals to run, Forrest, run!