Why are these blackhat sites so successful?
-
Here's an interesting conundrum. These are three sites with their respective rankings for "dental implants [city]":
http://dentalimplantsvaughan.ca - 9 (on google.ca)
http://dentalimplantsinhonoluluhi.com - 2 (on google.com)
http://dentalimplantssurreybc.ca - 7 (on google.ca)
These markets are not particularly competitive; however, all of these sites suffer from:
- Duplicate content, both internally and across sites (all of this company's implant sites have the exact same content, minus the bio pages and the local modifier).
- Average speed score.
- No structured data.
- No links.
And these sites are ranking relatively quickly. The Vaughan site went live 3 months ago.
But what's boggling my mind is that they rank on the first page at all. They seem to be doing the exact opposite of what you're supposed to do, yet they rank relatively well.
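For contrast, the structured data these sites are missing is typically a small JSON-LD block placed in the page head inside a `<script type="application/ld+json">` tag. A sketch for a local dental practice; every business detail here is invented for illustration:

```json
{
  "@context": "https://schema.org",
  "@type": "Dentist",
  "name": "Example Dental Implants Vaughan",
  "url": "http://dentalimplantsvaughan.ca",
  "telephone": "+1-905-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Vaughan",
    "addressRegion": "ON",
    "addressCountry": "CA"
  }
}
```

Markup like this is what lets Google tie a page to a specific business and location, which is exactly the kind of local signal these sites are ranking without.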
-
Not exactly. When it comes to different countries, like the example domains you listed above (.com and .ca), Google allows for mirrored or duplicate sites by country.
When it comes to multiple sites in the same country, Google will give value to the first use of the content and no value to the second use. In the example you gave of San Diego and Atlanta, it is important to create unique content, citations, and backlinks that are localized to each site's location.
I have a client that has two separate appliance companies in the same area with two separate websites. I've used some of the same general content on both, but each site also has unique content and unique links, and they both rank really well.
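The country-level duplication described above is normally signaled explicitly with hreflang annotations, per Google's multi-regional site guidelines. A minimal sketch; these domains are illustrative, not the sites from the question:

```html
<!-- In the <head> of the .ca page -->
<link rel="alternate" hreflang="en-ca" href="http://example-dental.ca/" />
<link rel="alternate" hreflang="en-us" href="http://example-dental.com/" />
<link rel="alternate" hreflang="x-default" href="http://example-dental.com/" />
<!-- The .com page carries the same set, so the annotations are reciprocal -->
```

With reciprocal annotations in place, Google treats the two near-identical sites as country-targeted alternates rather than competing duplicates.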
-
I guess what baffles me is that there's duplicate, spammy content, exactly what Google tells you to stay away from.
-
So, suppose a site is #1. It's for a bakery in Atlanta, Georgia. The content is doing really well. You're telling me that, as a bakery in San Diego, CA, I can take that content, slap it on my site, replace the business name and location information, and it'd be okay?
-
Yes, if they are different businesses, they should be treated differently.
-
Even if those sites are for different practices/businesses?
-
I have reviewed similar sites. It comes down to the exact-match URL. Plus, the space has to be not overly competitive. No doubt there are 600 other factors, but the dominant standout is the URL. Ironically, once they're ranking, they're tough to dislodge, unless the site dislodging them is 10x better; it cannot be just twice as good.
-
Not a bad-looking site. Google does allow for duplicate content across multi-regional sites. And sometimes a new site will get an initial boost, then drop back down. Also, if there is not a lot of localized website competition, Google will rank them as the most relevant for this category in Hawaii.
-
Interesting: view-source:http://dentalimplantssurreybc.ca/faqs/
Related Questions
-
Internal Ads on A Site
We serve ads on our site using a sub-domain. All ads use a redirect from ads.domain before sending users to the proper internal URL. Most of the content on our home page is ad-block driven. Is it possible, and does it make sense, to enter the sub-domain as a URL parameter in Google Webmaster Tools, letting Google know that this is something to be ignored? Many thanks
Technical SEO | CeeC-Blogger
-
Squidoo vs Personal Site
Hey guys, I'm Nikolas, a newb. I just signed up for the Pro membership trial after a lot of digging on the SEOmoz blog for months. First off, let me tell you a little about my story and SEO knowledge. I started off online on the well-known Squidoo site with revenue sharing. Because of my day job I had a lot of time to work on my articles and built up to a nice monthly salary of just over 1k in less than 5 months, which doubled and tripled in the last few months. SEO is like a sixth sense to me: on-page, off-page, and the lot. Most of what I read here is not new to me or something I didn't already know about, but it's good to freshen up and remember things, as there's a lot to search engine optimization. I have built up to over 500k unique visitors in less than a year and decided to move on to my own site 4 months ago. The niche is the exact same one I targeted on Squidoo. My site had a lot of issues at the start: the classic 301 redirection .htaccess fix I had to do, and a content management system building low-quality content pages via tags, which I have fixed (noindex) and removed with 404s. I've built up original, unique, valuable posts, interlinked them, and done the on-page and off-page SEO basics I did for Squidoo. The problem is that I can't seem to get any traction from Google: whereas my Squidoo search engine traffic is 80%, my site's Google traffic is 5-10%. I have the same number of articles on both sites, similar topics, similar on-page and off-page optimization, basically identical, but a lot better content on my new site. My Bing, Yahoo, and referral traffic is rising every day, but as Google is 85% of the market share, I am leaving a lot of money on the table. I hope that some of you more dedicated SEOs can give me a tip or two, explain exactly what is going on with my situation, and, if possible, take a look at my site hardwarepal.
Technical SEO | NikolasNikolaou
-
Site being indexed by Google before it has launched
We are currently coming towards the end of migrating one of our retail sites over to Magento. To our horror, we found out today that some pages are already being indexed by Google, and we have started receiving orders through the new site. Do you have any suggestions for what may have caused this? Or, similarly, what the best solution would be to de-index ourselves? We most recently excluded anything with a certain parameter from robots.txt; could this being implemented incorrectly have caused the issue? Thanks
Technical SEO | Sayers
-
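A common way to handle a pre-launch site that leaks into the index (a sketch, not advice from this thread) is to serve a noindex directive on every page and keep the pages crawlable, since blocking them in robots.txt would stop Google from ever seeing the noindex:

```html
<!-- In the <head> of every pre-launch page: -->
<meta name="robots" content="noindex">
<!-- Or, server-wide, send the equivalent HTTP header: X-Robots-Tag: noindex -->
```

Once the pages drop out of the index and the site is ready to launch, the directive is removed.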
How to find all the links to my site
Hi, I have been trying to find all the links to my site http://www.clairehegarty.co.uk but I am not having any luck. I have used Open Site Explorer, but it is not showing all the links, and when I go to my Google Webmaster page it shows me more pages than the SEOmoz tool does. Can anyone help me sort this out and find out exactly what links are going into my site? Many thanks
Technical SEO | ClaireH-184886
-
What to include on a sitemap for a huge site?
I have a very large site and I'm not sure what all to include on the sitemap page. We have categories such as items1, items2 and in the items1 category are 100 vendors with their individual vendor pages. Should I link all 100 vendor pages on the sitemap or just the main items1 category?
Technical SEO | CFSSEO
-
Google Webmaster Site Performance
In Webmaster Tools, under Labs > Site Performance, Google provides your average page load time. When Google grades a page, does it use how long that specific page takes to load, or does it use the overall average page load time for the domain, as provided in Labs > Site Performance?
Technical SEO | Bucky
-
Robots.txt blocking site or not?
Here is the robots.txt from a client site. Am I reading this right: that the robots.txt is saying to ignore the entire site, but the #'s are saying to ignore the robots.txt command?
# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /
Technical SEO | 540SEO
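A quick way to answer "is this robots.txt blocking the site or not?" is Python's standard-library urllib.robotparser. A sketch using a comment-only file like the one described in the question (the URL is illustrative):

```python
from urllib import robotparser

# A robots.txt where the ban-all directives are commented out, as in the
# question above: parsers ignore comment lines, so nothing is disallowed.
ROBOTS_TXT = """\
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# With every directive commented out, all crawlers may fetch everything.
print(parser.can_fetch("*", "http://example.com/any-page"))  # True
```

So the file as shown blocks nothing; uncommenting the `User-Agent: *` and `Disallow: /` lines would flip `can_fetch` to False for every path on the site.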