Duplicate exact match domains flagged by google - need help reinclusion
-
Okay, I admit I've been naughty... I have 270+ domains that are all exact match for city + keyword, and I've built tons of backlinks to all of them. I reaped the benefits... and now Google has found my duplicate templates and flagged them all. Question is, how do I get them reincluded quickly?
Do you guys think converting each site to a basic WordPress template (275 different templates in all) and then manually submitting a reinclusion request for each site would do it, or do you recommend:
1. create a unique site template for each site
2. create unique content
Any other advice for getting reincluded? Aside from owning up and saying, "Hey, I used the same template for all the sites, and I have created new templates and unique content, so please let me back in."
-
A few more questions
- Are they all hosted on the same server / IP address?
- Are they all registered with the same domain registrar / same company or person?
- Is the content duplicated too, or have you spun it?
- Are the links low quality?
- Do the sites all have the "same" links? (Have you submitted multiple templated sites to the same directories etc...)
-
EMDs that all share the same template and similar content, with tons of low-quality links (many built on keyword anchor text), and, even worse, heavily interlinked with each other, might be the worst combination you can have!
-
I'm curious to know if this has been resolved. Like others here, I've been successful on several reinclusions, so here's my two bits:
Template duplication is probably not related to the manual penalty.
Duplicate content will definitely tank your sites (but may not come through as a "violation" email in WMT).
The culprit is almost certainly backlinks. The bonus is that you have access to all sites - were any not penalized? When you got your backlinks, did you pay for packages across multiple domains?
When I do reinclusions for agencies that handle multiple clients, I compile a database of backlinks using Open Site Explorer and the External Links section of WMT, then use Excel functions to find which links are common to all penalized sites and which are not.
By doing so you can really highlight the backlinks that are most probable for causing red flags. Remove those links, and write Google. Just tell them you understand you have bad links, and that you've removed as many as possible. List the links right in your email.
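The same Excel step can be sketched in a few lines of Python. The site names and backlink domains below are made up for illustration; the idea is simply to intersect each penalized site's exported backlink set to surface the links they all share.

```python
# Hypothetical backlink exports (e.g. from Open Site Explorer or WMT),
# one set of linking domains per penalized site.
penalized = {
    "site-a.com": {"dir1.example", "dir2.example", "spamfarm.example"},
    "site-b.com": {"dir1.example", "spamfarm.example", "localblog.example"},
    "site-c.com": {"spamfarm.example", "dir1.example"},
}

# Links common to every penalized site are the strongest red-flag candidates.
common = set.intersection(*penalized.values())

# Links unique to one site are less likely to be the shared cause.
unique = {
    site: links - set.union(*(l for s, l in penalized.items() if s != site))
    for site, links in penalized.items()
}

print(sorted(common))  # ['dir1.example', 'spamfarm.example']
```

Links that show up in `common` are the ones to remove first and to list in the reconsideration email.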
-
Why don't you just come up with another scheme to make money!
This one seems to have run its course. Be happy with the money you made and move on.
-
If you made a good return on these sites, don't spin the content or do anything like that.
1. Replace all the content of the sites with good, high-quality content. This will make them stick forever.
2. Keep the design if it's solid in terms of SEO (I don't think having the same design on all sites affects rankings).
3. After you have done 1 and 2 on the sites, you can link related cities to each other.
Stop doing black hat and keep it real. All black hat will come back and bite you in the a**.
Good luck
-
Still mixing a lot of fact and fiction. Exact match domains still work. People have been saying that Google is 'going after' them, but that's not really true. There are plenty of legitimate exact match domains out there, and separating the good guys from the bad guys is not going to happen at the domain level; it doesn't make sense. Also, please keep in mind what the duplicate content penalty is for and how it's used. Having your site drop after an algo shift is not uncommon. I know plenty of domain farmers (in the 2 million+ domain range), and the biggest issues they face are bad rankings because of lack of content, overly aggressive link building, link wheels, etc.
I believe that a warning in GWT means it's been flagged for a manual review, but I might be totally wrong about that. In the 15 years that I have been doing this, I have never been warned or received an email about unnatural link building.
I am currently chasing a link spammer/domain squatter right now, so I can attest that those methods still work just fine if you know what you're doing. Once your site makes it in front of a human, it's simple to figure out what other sites you have and what you are up to. Setting up individual sites with different WHOIS info, different C-blocks, unique links, and unique content is extremely time consuming.
I have found that I make a lot more money focusing on a handful of sites and making them as good as possible than I do managing 100s of sites with spun content, crappy links, etc.
Start thinking outside the box to get real visitors coming to the site and interacting; social signals are a huge part of search. Look at microformats and push reviews; give people a reason to interact and refer other people. City directories and sites like that still have a place in the search engines.
If these sites really make money, I would go after local search because of the city name. I guess it depends on what you think a lot of money is. If all else fails, find a local biz that sells or caters to your keyword and try to sell them the domain, especially if it's already getting traffic.
-
yes, I also want to see
-
I have to agree with some of the other responses. Google is really cracking down on duplicate content and code. It is time consuming to write new content and redesign the sites, but you have to weigh the costs. If the site is your livelihood, then it is worth working on it to get back into Google's good books. If it is a site that you created just to get link juice, then maybe you shouldn't put the time and effort into it.
-
If you want to target multiple locations, then do it properly: don't duplicate your site over every state; rather, create a unique website with its own content that is more relevant to the location you are targeting. If they are unique enough, then Google may consider them, since you have put in the time and effort to make them different. This also includes local phone numbers in each state to show that you are actually in the state.
As for "What does a user in Montana care if the website looks the same as the one in New York?": the problem is that local search may not dominate the right way in every state, so the poor user in California sees 7 sites that all lead to the same place. This now becomes spam that annoys the hell out of the user. It is all about quality; 50+ pages all leading to the same place result in a low-quality experience for users and will likely be denied.
-
Link profile and exact match. I was able to grab the equivalent of plumberlosangeles in every major city in the country. They were #1 for New York, LA, Houston, Philly, etc... This was a 7-figure gross-generating adventure.
-
What value? Same as ServiceMagic for instance...........We have as much right to be in a city as a local 'plumber'. What does a user in Montana care if the website looks the same as the one in New York?
-
They're split up over 5-6 hosts...but identical WHOIS info......so there's maybe 20-30 per hosting account.
-
No, they were red-flagged in Webmaster Tools. It says your site has violated Google's guidelines.
I'm guessing that means it's a manual penalty?
-
Not the best idea. Google will hunt you down and shut you down; my guess is that they may even track you over multiple Google accounts and closely watch any related material. This is black hat all the way. I spend my life reporting pages like yours and getting them shut down as quickly as possible; they are viruses and have no reason to be in the results, as they offer absolutely nothing to user experience. The only thing they give value to is your own pocket. There are Google quality experts that are likely watching your sites like a hawk, and my guess is that your re-inclusion requests will never be granted. I don't mean to be so blunt... actually I do. What value are you bringing to the users?
-
Yeah I would like to see any pointers towards #2, my competition is using these black hat techniques and we would like to see the best way to compete with them!
-
I would also start by identifying the sites that brought the most traffic or money. I presume you're running AdSense on these sites (side note: also remember to have fewer ads above the fold since Google's last layout update).
The sites that perform the best would be the ones I'd start with first when it comes to redesigning and making sure there is no further interlinking to the other sites.
Also make sure they are on different Class C IPs.
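For reference, a "Class C" here just means the first three octets of the IPv4 address, i.e. the /24 block. A quick sketch (with made-up IPs) of how you could check whether any of your sites share one:

```python
import ipaddress

# Hypothetical resolved IPs for a few of the sites.
site_ips = {
    "plumber-losangeles.example": "203.0.113.10",
    "plumber-newyork.example": "203.0.113.55",
    "plumber-houston.example": "198.51.100.7",
}

def class_c(ip: str) -> ipaddress.IPv4Network:
    """Return the /24 network (the old 'Class C' block) containing ip."""
    return ipaddress.ip_network(f"{ip}/24", strict=False)

# Group sites by /24; any group with more than one site shares a Class C.
groups: dict[ipaddress.IPv4Network, list[str]] = {}
for site, ip in site_ips.items():
    groups.setdefault(class_c(ip), []).append(site)

for net, sites in groups.items():
    if len(sites) > 1:
        print(f"{net} shared by: {sites}")
```

In this example the Los Angeles and New York sites both sit in 203.0.113.0/24, so they would be flagged.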
-
One question: are you using SEO hosting, or are they all hosted on the same IP?
-
Friend, don't spin or anything like that; create unique and useful content. Otherwise Panda will crunch you...
-
Guilty!... and it takes one to know one.
You may have little chance of redeeming those sites. 301 redirects seem to wipe the crap clean. So you get the traffic and some of the link juice.
Or...
Pass the website on. Give it to someone else... Change the registration.
..... As an added note, I once heard Matt Cutts say, "Why would you want to buy one of these (burned) sites? Why wouldn't you start over with a new site? Then at least you are at ground zero instead of in a hole."
-
Google beat up real estate sites for the same reason: city + "homes for sale" type sites where all that changed was the city, especially where the sites would create mini fact pages for each city to try to pull rank for that area's keyword searches.
Too many sites have used this technique, and the market is saturated with such low-quality sites. Niche directories focusing on one city are better, but you need to watch the boilerplate content, and your listings need to contribute more to the conversation than just name, address, and telephone like the yellow pages.
I doubt that Google is penalizing you for using the same template; as mentioned, so many sites use WordPress themes where the code structure is identical, and they are doing fine. I am curious whether you are interlinking the websites.
Good content, usefulness, and fresh contributions to the site will always be a good recipe. Fix the issues and request reinclusion, and you shouldn't have a problem.
-
Hi Ilya,
It's been a few months since your original question. Are you still looking for some advice regarding this? Or do you have something you could share with us about what you did and let us know what we should do or not do if we're in a similar situation?
Thanks!
-
Are all of the domains hosted under the same IP or with the same WHOIS info? If so, I suggest you use different hosting providers (hence different IPs) and change up the WHOIS info for each of the domains (you can even use private registration).
-
IMHO this is absolutely a content issue. Rather than spinning the content, my suggestion would be to spend a little bit of $$ on having unique articles written targeting your top-level head and body keywords on each domain. You can get unique articles created for as little as $3 each using oDesk, Guru, or similar services. At this point I think spinning the content is not a great approach, as Google has already recorded that these sites contain duplicate content and has flagged them as such. Spend the money on some real content, rework your post and page titles to reflect the new unique content, regenerate your sitemaps to pick up the reworked pages, and then resubmit.
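As a rough illustration of the sitemap-regeneration step (the domain and page URLs below are hypothetical), here is a minimal generator that lists each reworked page with a fresh lastmod date:

```python
import datetime
from xml.sax.saxutils import escape

# Hypothetical page URLs for one of the reworked sites.
pages = [
    "https://plumber-losangeles.example/",
    "https://plumber-losangeles.example/services/",
    "https://plumber-losangeles.example/contact/",
]

def build_sitemap(urls: list[str]) -> str:
    """Build a minimal sitemap.xml listing each URL with today's lastmod."""
    today = datetime.date.today().isoformat()
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc><lastmod>{today}</lastmod></url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

print(build_sitemap(pages))
```

Write the result to `sitemap.xml` at the site root and resubmit it in Webmaster Tools so the crawler revisits the rewritten pages.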
-
It is time for you to spin your content and try to make the sites distinct from each other. Try it, then ping the sites. If nothing changes after a month, you will have to write a love letter to Google.
-
No, I haven't. Thanks for sharing, David! I guess I'm doing it first thing tomorrow.
-
You may have already seen this article from a few weeks back, but it is closely related to what you are thinking about doing with the domains:
http://searchenginewatch.com/article/2086693/Exact-Match-Domains-Can-Double-Clicks-on-PPC-Ads
Good luck!
-
A while ago a business manager I work with purchased 30 exact match domains, intending to use them for "SEO purposes". I actually poked around for ideas here too. In the end I decided I couldn't use them at all and got them redirected. The advice I got here was to focus my effort on the main site. What I plan to use them for at this point is AdWords; I will try to boost the CTR. If that does not work, I'll 301 them to the main site again.
My plan is to grab the domains and put a canonical link element across the whole domain, since it will duplicate the main site. Since AdWords URLs tend to get pasted into forum threads, blog posts, and social networks, I'm kind of hoping for a collateral effect in terms of backlinks.
Even if you get your sites reincluded, there is no guarantee that they won't go down again pretty soon. This seems like a short-term strategy anyway. Get the good URLs redirected to your main site and focus your effort there.
-
I would consider adding social features such as SEO-able comments/forums/Disqus etc. to help build each site's "local"-ness in keywords. Get each site as far away from the "template" mindset as you can.
In terms of "templates", keep in mind that Google is not looking at template code for the most part; they're looking at the content (the stuff in text) on your site and its relevance.
I'd also consider shelving 260 of those domains and making 10 really great sites.
best of luck!
-
I recommend the 2 points you mentioned, followed, of course, with a dose of good old backlinks and backed by some social signals as well. Yes, it's more work, but you will see better ROI in the long run.
Also, echoing other users here, the template is not an issue at all; it's what you put on it (the content, titles, and other meta tags) that needs to be unique, especially if you are a low-to-mid-level authority. Very high authority sites can get away with duplicate content, as Google sometimes tends to blindly trust them over the original source.
-
They have been attacked time and time again; it's not happened yet, and MC keeps saying they are looking at it. Quite a few took a bump when I put that comment out, but weeks later most were back in the index.
-
Rishi -
Can you recommend a good resource that supports point number 2? I'd love to learn more about what they are actually doing to downgrade exact match domains.
Thanks
-
Google hates duplicate content, so much so that you will have problems with two URLs, let alone 270 of them. Don't bother trying anything funny, like spinning the articles, at least for now. When you submit your site for re-inclusion, it will be examined by humans, not robots. You need to have a good story and a nice clean site if you want to get back into the game. Given that your sites are 98 percent duplicate content, that really means you have only one, maybe two, websites at best. I would set up 301 redirects from all your other websites to your best-performing website. This will show Google that you have solved your duplicate content problem while keeping all the links from the other sites. Good luck! dmac
-
There's no need to worry about the templates... but your content sounds 99% dupe across all of your 270+ sites. I'd start with rewriting the pages for your best-converting sites and take it one by one from there. Not sure if this is in fact a manual penalty; are your URLs still appearing in the SERPs when you search for them?
-
Sorry - I had to giggle when I first read your question
1. Google has made changes to its algo that are finding a number of these sites, especially ones with dupe content.
2. They are in the process of downgrading the value of exact match domains.
My advice? If you are adamant on getting these sites back up, take one: customise the template, make sure it's not linked to or linking to any of your others, rewrite the content and make it unique, and then send in a reinclusion request to Google, outlining what you did and WHY you did it...
-
Sorry if I didn't explain.
The sites are all service-business related. They're not AdSense in any way. They all have 100% identical content except for some keyword replacement where the city keyword goes. Each site is related to a different city, for example plumber + Los Angeles, or plumber + New York. Each site has a different phone number, so I AM servicing those areas; my problem is the duplicate content. The link profiles are very different.
-
Okay, first of all, I don't think the penalty is caused by the duplicate templates (just think of all the WordPress sites using the same template!!).
Possible reasons for the penalty:
- Duplicate content (spin articles much?)
- Low quality links (link farms? paid links?)
The reasons above might all contribute to why your sites are penalised.
From the sound of things, it seems like you are doing some black hat stuff here, milking adsense and not providing unique and useful content to users. I am afraid this would not be a quick fix, it's hard to justify how your 270+ domains will become relevant and helpful resources overnight!