Why are these blackhat sites so successful?
-
Here's an interesting conundrum. Below are three sites with their respective rankings for "dental implants [city]":
http://dentalimplantsvaughan.ca - 9 (on google.ca)
http://dentalimplantsinhonoluluhi.com - 2 (on google.com)
http://dentalimplantssurreybc.ca - 7 (on google.ca)
Granted, these markets are not particularly competitive. Even so, all of these sites suffer from:
- Duplicate content, both internally and across sites (all of this company's implant sites have the exact same content, apart from the bio pages and the local modifier).
- Average speed score.
- No structured data.
- No links.
And these sites are ranking relatively quickly. The Vaughan site went live 3 months ago.
But what's boggling my mind is that they rank on the first page at all. They're doing the exact opposite of what you're supposed to do, yet they rank relatively well.
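One of the missing items above is structured data. As an illustration of what these sites could add but don't, here is a minimal schema.org LocalBusiness-style payload emitted as a JSON-LD script tag. The business name and address details are invented placeholders, not data from the actual sites:

```python
import json

# Hypothetical minimal Dentist (a LocalBusiness subtype) markup of the
# kind the sites in question lack. All values are invented placeholders.
structured_data = {
    "@context": "https://schema.org",
    "@type": "Dentist",
    "name": "Example Dental Implants Vaughan",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Vaughan",
        "addressRegion": "ON",
        "addressCountry": "CA",
    },
    "url": "http://dentalimplantsvaughan.ca",
}

# Embed the payload as a JSON-LD script tag for the page <head>.
snippet = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(structured_data)
)
print(snippet)
```

That the sites rank without even this basic markup is part of what makes the situation puzzling.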
-
Not exactly. When it comes to different countries, like the example domains you listed above (.com and .ca), Google allows for mirrored or duplicate sites by country.
When it comes to multiple sites in the same country, Google will give value to the first use of the content, then no value to the second use. For the San Diego and Atlanta example, it is important to create unique content, citations, and backlinks that are localized to each site's location.
I have a client that has two separate appliance companies in the same area with two separate websites. I've used some of the same general content on both, but each site also has content that is unique to it, along with unique links, and they both rank really well.
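For intuition on how a search engine can tell that two pages are near-duplicates even after the city name is swapped, here is a minimal sketch of w-shingling with Jaccard similarity, one well-known near-duplicate detection technique. The two page texts are invented toy examples, and real systems work at a much larger scale (e.g. with simhash fingerprints):

```python
def shingles(text, w=3):
    """Return the set of w-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (1.0 = identical)."""
    return len(a & b) / len(a | b)

# Two toy pages identical except for the local modifier.
page_a = ("we offer affordable dental implants in Vaughan and our experienced "
          "team will restore your smile with safe modern dental implants")
page_b = ("we offer affordable dental implants in Surrey and our experienced "
          "team will restore your smile with safe modern dental implants")

sim = jaccard(shingles(page_a), shingles(page_b))
print(round(sim, 2))
```

Swapping a single city name only disturbs the few shingles that contain it, so the similarity score stays high, which is why "same content minus the local modifier" is easy for an engine to spot.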
-
I guess what baffles me is that there's duplicate, spammy content, exactly what Google tells you to stay away from.
-
So, suppose a site is #1. It's for a bakery in Atlanta, Georgia. The content is doing really well. You're telling me that, as a bakery in San Diego CA, I can take that content, slap it on my site, replace the business name and location information, and it'd be okay?
-
Yes, if they are different businesses, they should be treated differently.
-
Even if those sites are for different practices/businesses?
-
I have reviewed similar sites. It comes down to the exact-match URL. Also, the space has to be not overly competitive. No doubt there are 600 other factors, but the dominant standout is the URL. Ironically, once these sites are ranking they're tough to dislodge, unless the site dislodging them is 10x better; it cannot be just twice as good.
-
Not a bad-looking site. Google does allow for duplicate content across multi-regional sites, and sometimes a new site will get an initial boost and then drop back down. Also, if there is not a lot of localized website competition, Google will rank the site as the most relevant for this category in Hawaii.
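For a single business that intentionally runs mirrored country sites, the standard way to declare this to Google is hreflang annotations, which mark the pages as regional variants of one another rather than spam duplicates. A minimal sketch that generates the link tags; the domains here are invented placeholders, not the sites from this thread:

```python
# Hypothetical country variants of one page; example.* domains are placeholders.
variants = {
    "en-gb": "https://example.co.uk/dental-implants/",
    "en-us": "https://example.com/dental-implants/",
}

# Each variant page should carry the full set of alternate tags,
# including one pointing at itself.
tags = [
    '<link rel="alternate" hreflang="{}" href="{}" />'.format(code, url)
    for code, url in variants.items()
]
for tag in tags:
    print(tag)
```

Note this only applies to the same business serving different regions; it is not a license for unrelated businesses to share content.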
-
Interesting: view-source:http://dentalimplantssurreybc.ca/faqs/