Duplicate exact match domains flagged by Google - need help with reinclusion
-
Okay, I admit, I've been naughty... I have 270+ domains that are all exact match for city + keyword, and I've built tons of backlinks to all of them. I reaped the benefits... and now Google has found my duplicate templates and flagged them all. The question is, how do I get them reincluded quickly?
Do you guys think converting each site to a basic WordPress template (simply using 275 different templates) and then manually begging for reinclusion for each site would do it? Or do you recommend:
1. create a unique site template for each site
2. create unique content
Any other advice for getting reincluded? Aside from owning up and saying, "Hey, I used the same template for all the sites, and I have created new templates and unique content, so please let me back in."
-
A few more questions:
- Are they all hosted on the same server / IP address? (A quick script like the sketch below can check this.)
- Are they all registered with the same domain registrar / same company or person?
- Is the content duplicated too? Or have you spun the content?
- Are the links low quality?
- Do the sites all have the "same" links? (Have you submitted multiple templated sites to the same directories etc...?)
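For the hosting question, a minimal sketch in Python, assuming a hypothetical domains.txt file with one domain per line:

```python
# Minimal sketch: resolve each domain and flag IPs shared by more
# than one of them. domains.txt (one domain per line) is assumed.
import socket
from collections import defaultdict

shared = defaultdict(list)
with open("domains.txt") as f:
    for domain in (line.strip() for line in f if line.strip()):
        try:
            shared[socket.gethostbyname(domain)].append(domain)
        except socket.gaierror:
            print(f"could not resolve {domain}")

for ip, domains in shared.items():
    if len(domains) > 1:
        print(f"{ip} is shared by {len(domains)} domains: {', '.join(domains)}")
```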
-
EMDs that all have the same template and similar content, plus tons of low-quality links (many built through keywords), and, even worse, all those EMDs heavily interlinked: that might be the worst combination you can have!
-
I'm curious to know if this has been resolved. Like others here, I've been successful with several re-inclusions, so here are my two bits:
Template duplication is probably not related to the manual penalty.
Duplicate content will definitely tank your sites (but may not come through as a "violation" email in WMT).
The culprit is almost certainly backlinks. The bonus is that you have access to all sites - were any not penalized? When you got your backlinks, did you pay for packages across multiple domains?
When I do re-inclusions for agencies that handle multiple clients, I compile a database of backlinks using Open Site Explorer and the External Links section of WMT, and then use Excel functions (or a short script like the sketch below) to find which links are common to all penalized sites and which are not.
By doing so you can really highlight the backlinks that are most likely causing red flags. Remove those links, and write Google. Just tell them you understand you have bad links and that you've removed as many as possible. List the links right in your email.
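A minimal sketch of that common-links comparison in Python rather than Excel, assuming one CSV export per penalized site with the linking URL in the first column (the file names are hypothetical):

```python
# Minimal sketch: intersect per-site backlink exports to surface the
# links common to every penalized site. Usage:
#   python common_links.py site1.csv site2.csv site3.csv
import csv
import sys
from functools import reduce

def backlinks(path):
    with open(path, newline="") as f:
        return {row[0].strip() for row in csv.reader(f) if row}

link_sets = [backlinks(path) for path in sys.argv[1:]]
common = reduce(set.intersection, link_sets) if link_sets else set()

for link in sorted(common):
    print(link)  # the most likely red-flag candidates
```

Links that appear in every penalized site's profile, but not in an unpenalized one, are your strongest removal candidates.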
-
Why don't you just come up with another scheme to make money!
This one seems to have run its course. Be happy with the money you made and move on.
-
If you made a good return on these sites, don't spin the content or do anything like that.
- Replace all the content of the sites with good, high-quality content. This will make them stick forever.
- Keep the design if it's solid in terms of SEO. (I don't think having the same design on all sites affects the ranking.)
- You can link related cities to each other, after you have done the first two steps on the sites.
Stop doing black hat and keep it real. All black hat will come back and bite you in the a**.
Good luck
-
Still mixing a lot of fact and fiction. Exact match domains still work. People have been saying that Google is 'going after' them, but that's not really true. There are plenty of exact match domains out there that are legitimate, and separating the good guys from the bad guys is not going to happen at the domain level; it doesn't make sense. Also, please keep in mind what the duplicate content penalty is for and how it's used. Having your site drop after an algo shift is not uncommon. I know plenty of domain farmers (in the 2 million+ domain range) and the biggest issues they face are bad rankings because of lack of content, overly aggressive link building, link wheels, etc.
I believe that a warning in GWT means it's been flagged for a manual review, but I might be totally wrong about that. In the 15 years that I have been doing this, I have never been warned or received an email about unnatural link building.
I am currently chasing a link spammer/domain squatter right now, so I can attest that those methods still work just fine if you know what you're doing. Once your site makes it in front of a human, it's simple to figure out what other sites you have and what you are up to. Setting up individual sites with different WHOIS info, different C-blocks, unique links, and unique content is extremely time consuming.
I have found that I make a lot more money focusing on a handful of sites and making them as good as possible than I do managing 100s of sites with spun content, crappy links, etc.
Start thinking outside the box to get real visitors coming to the site and interacting; social signals are a huge part of search. Look at microformats and push reviews; give people a reason to interact and refer other people. City directories and sites like that still have a place in the search engines.
If these sites really make money, I would go after local search because of the city name. I guess it depends on what you think a lot of money is. If all else fails, find a local biz that sells or caters to your KW and try to sell them the domain, especially if it's already getting traffic.
-
Yes, I also want to see.
-
I have to agree with some of the other responses. Google is really cracking down on duplicate content and code. It is time consuming to write new content and redesign the sites, but you have to weigh the costs. If the site is your livelihood, then it is worth working on it to get back into Google's good books. If it is a site that you have created just to get link juice, then maybe you shouldn't put the time and effort into it.
-
If you want to target multiple locations then do it properly. Do not duplicate your site across every state; rather, create a unique website with its own content that is more relevant to the location you are targeting. If the sites are unique enough, Google may give them credit, since you have put in the time and effort to make them different. This also includes local phone numbers in each state to show that you are actually in the state.
As for "What does a user in Montana care if the website looks the same as the one in New York?" - the problem is that local search may not filter things the right way in every state, so the poor user in California sees 7 sites that all lead to the same place. This becomes spam that annoys the hell out of the user. It is all about quality: 50+ pages all leading to the same place will result in a low-quality experience for users and will therefore likely be denied.
-
Link profile and exact match. I was able to grab the equivalent of plumberlosangeles in every major city in the country. They were #1 for New York, LA, Houston, Philly, etc... This was a 7-figure gross-generating adventure.
-
What value? The same as ServiceMagic, for instance... We have as much right to be in a city as a local 'plumber'. What does a user in Montana care if the website looks the same as the one in New York?
-
They're split up over 5-6 hosts... but with identical WHOIS info... so there's maybe 20-30 per hosting account.
-
No, they were red-flagged in Webmaster Tools. It says the site has violated Google's guidelines.
I'm guessing that means it's a manual penalty?
-
Not the best idea. Google will hunt you down and shut you down; my guess is that they may even track you over multiple Google accounts and closely watch any related material. This is black hat all the way. I spend my life reporting pages like yours and getting them shut down as quickly as possible; they are viruses and have no reason to be in the results, as they offer absolutely nothing to the user experience. The only thing they give value to is your own pocket. There are Google quality experts likely watching your sites like a hawk, and my guess is that your re-inclusion requests will never be granted. I don't mean to be so blunt... actually I do. What value are you bringing to the users?
-
Yeah, I would like to see any pointers towards #2; my competition is using these black hat techniques and we would like to see the best way to compete with them!
-
I would also start by identifying the sites that brought the most traffic or money. I presume you're running AdSense on these sites (side note: also remember to have fewer ads above the fold since Google's last page-layout update).
The sites that are performing the best would be the ones I'd start with when it comes to redesigning and making sure there is no further interlinking to the other sites.
Also make sure they are on different Class C IPs (see the sketch below).
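A minimal sketch of that Class C check, again assuming a hypothetical domains.txt list; it groups each domain's resolved IP by its /24 block:

```python
# Minimal sketch: group domains by the /24 ("Class C") block of
# their resolved IP to spot clusters on the same network.
import socket
from collections import defaultdict

blocks = defaultdict(list)
with open("domains.txt") as f:
    for domain in (line.strip() for line in f if line.strip()):
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            continue
        block = ".".join(ip.split(".")[:3])  # e.g. 192.0.2 for 192.0.2.15
        blocks[block].append(f"{domain} ({ip})")

for block, members in sorted(blocks.items()):
    if len(members) > 1:
        print(f"{block}.0/24 hosts {len(members)} domains: {', '.join(members)}")
```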
-
One question: are you using SEO hosting, or are they all hosted on the same IP?
-
Friend, don't spin or whatever. Create unique and useful content, or Panda will crunch you...
-
Guilty!... and it takes one to know one.
You may have little chance of redeeming those sites. 301 redirects seem to wipe the crap clean. So you get the traffic and some of the link juice.
Or...
Pass the website on. Give it to someone else... Change the registration.
..... As an added note, I once heard Matt Cutts say, "Why would you want to buy one of these (burned) sites? Why wouldn't you start over with a new site? Then at least you are at ground zero instead of in a hole."
-
Google beat up real estate sites for the same reason: city + "homes for sale" type sites where all that changed was the city, especially where the sites would create mini fact pages for each city to try to pull rank for that area's keyword searches.
Too many sites have used this technique and the market is saturated with such low-quality sites. Niche directories focusing on one city are better, but you need to watch the boilerplate content, and your listings need to contribute more to the conversation than just name, address, telephone like the yellow pages.
I doubt that Google is penalizing you for using the same template; as mentioned, so many sites use WordPress themes where the code structure is identical, and they are doing fine. I am curious whether you are interlinking the websites.
Good content, usefulness, and fresh contributions to the site will always be a good recipe. Fix things, request re-inclusion, and you shouldn't have a problem.
-
Hi Ilya,
It's been a few months since your original question. Are you still looking for some advice regarding this? Or do you have something you could share with us about what you did and let us know what we should do or not do if we're in a similar situation?
Thanks!
-
Are all of the domains hosted on the same IP, or do they have the same WHOIS info? If so, then I suggest you use different hosting providers (hence different IPs) and change up the WHOIS info for each of the domains (you can even use private registration).
-
IMHO this is absolutely a content issue. As opposed to spinning the content, my suggestion would be to spend a little bit of $$ on having unique articles written targeting your top-level head and body keywords on each domain. You can get unique articles created for as little as $3 each using oDesk, Guru, or similar services. At this point spinning the content is not a great approach, as Google has already recorded that these sites contain duplicate content and flagged them as such. Spend the money on some real content, re-work your post and page titles to reflect the new unique content, re-generate your sitemaps to include the new unique titles (see the sketch below), and then re-submit.
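A minimal sketch of the sitemap-regeneration step; the domain and page paths are hypothetical placeholders for each site's real values:

```python
# Minimal sketch: rebuild sitemap.xml for one site after the content
# rewrite. Replace the domain and paths with the site's real values.
from datetime import date
from xml.sax.saxutils import escape

domain = "https://www.example.com"        # hypothetical domain
pages = ["/", "/services/", "/contact/"]  # hypothetical page paths

entries = "\n".join(
    "  <url>\n"
    f"    <loc>{escape(domain + path)}</loc>\n"
    f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
    "  </url>"
    for path in pages
)

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    f.write(entries + "\n</urlset>\n")
```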
-
It is time for you to spin your content and try to make the sites distinct from each other. Try it, then ping the sites. If nothing changes after a month, you will have to write a love letter to Google.
-
No, I haven't. Thanks for sharing, David! I guess I'm doing it first thing tomorrow.
-
You may have already seen this article from a few weeks back, but it is closely related to what you are thinking about doing with the domains:
http://searchenginewatch.com/article/2086693/Exact-Match-Domains-Can-Double-Clicks-on-PPC-Ads
Good luck!
-
A while ago, a business manager I work with purchased 30 exact match domains, intending to use them for "SEO purposes". I actually poked around for ideas here too. In the end I decided I couldn't use them at all and got them redirected; the advice I got here was to focus my effort on the main site. What I plan to use them for at this point is AdWords, to try to boost the CTR. If that does not work, I'll 301 them to the main site again.
My plan is to grab the domains and put a canonical link element (<link rel="canonical" href="..."> pointing back at the main site) across the whole domain, since it will duplicate the main site. Since AdWords URLs tend to get pasted into forum threads, blog posts, and social networks, I'm kind of hoping for a collateral effect in terms of backlinks.
Even if you get your sites reincluded, there is no guarantee that they won't go down again pretty soon. This seems like a short-term strategy anyway. Get the good URLs redirected to your main site and focus your effort there.
-
I would consider adding social features such as SEO-able comments/forums/Disqus etc. to help build each site's "local"-ness in keywords. Get each site as far away from the "template" mindset as you can.
In terms of "templates", keep in mind that Google is not looking at template code for the most part; they're looking at the content (the stuff in text) on your site and its relevance.
I'd also consider shelving 260 of those domains and making 10 really great sites.
Best of luck!
-
I recommend the 2 points you mentioned. Of course, follow with a dose of good old backlinks, backed by some social signals as well. Yes, it's more work, but you will see better ROI in the long run.
Also, echoing other users here: the template is not an issue at all; it's what you put on it. The content, titles, and other meta tags need to be unique, especially if you are a low- to mid-level authority. Very high-authority sites can get away with duplicate content, as Google sometimes tends to blindly trust them over the original source.
-
They have been attacked time and time again; it hasn't happened yet, and MC keeps saying they are looking at it. Quite a few took a bump when I put that comment out, but weeks later most were back in the index.
-
Rishi -
Can you recommend a good resource that supports point number 2? I'd love to learn more about what they are actually doing to downgrade exact match domains.
Thanks
-
Google hates duplicate content, so much so that you will have problems with two URLs, let alone 270 of them. Don't bother trying anything funny, like spinning the articles, at least for now. When you submit your site for re-inclusion it will be examined by humans, not robots. You need to have a good story and a nice clean site if you want to get back into the game. Given that your sites are 98 percent duplicate content, that really means you have only one, maybe two, websites at best. I would make 301 redirects from all your other websites to your best-performing website (see the sketch below). This will show Google that you have solved your duplicate content problems while keeping all the links from the other sites. Good luck! dmac
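A minimal sketch of that consolidation, assuming Apache hosting, a hypothetical domains.txt list of the old domains, and a hypothetical target URL; it writes a catch-all 301 .htaccess into a folder per domain's docroot:

```python
# Minimal sketch: generate a catch-all 301 .htaccess per old domain,
# redirecting every URL to the best-performing site.
import os

target = "https://www.your-best-site.com/"  # hypothetical target site

with open("domains.txt") as f:
    for domain in (line.strip() for line in f if line.strip()):
        os.makedirs(domain, exist_ok=True)  # one folder per docroot
        with open(os.path.join(domain, ".htaccess"), "w") as ht:
            ht.write("RewriteEngine On\n")
            ht.write(f"RewriteRule ^(.*)$ {target} [R=301,L]\n")
```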
-
There's no need to worry about the templates... but your content sounds 99% dupe across all of your 270+ sites. I'd start with rewriting the pages for your best-converting sites and take it one by one from there. Not sure if this is in fact a manual penalty; are your URLs still appearing in the SERPs when you search for them?
-
Sorry, I had to giggle when I first read your question.
1. Google has made changes to the algo that are finding a number of these sites, especially ones with dupe content.
2. They are in the process of downgrading the value of exact match domains.
My advice? If you are adamant on getting these sites back up, take one: customise the template, make sure it's not linked to or linking to any of your others, rewrite the content and make it unique, and then send in a reinclusion request to Google, outlining what you did and WHY you did it...
-
Sorry if I didn't explain.
The sites are all service-business-related. They're not AdSense in any way. They all have 100% identical content except for some keyword replacement where the city keyword is. Each site is related to a different city, for example plumber + Los Angeles, or plumber + New York. Each site has a different phone number, so I AM servicing those areas; however, my problem is the duplicate content. The link profiles are very different.
-
Okay, first of all, I don't think the penalty is caused by the duplicate templates (just think of all the WordPress sites using the same template!!)
Possible reasons for the penalty:
- Duplicate content (spin articles much?)
- Low-quality links (link farms? paid links?)
The reasons above might all contribute to why your sites are penalised.
From the sound of things, it seems like you are doing some black hat stuff here, milking AdSense and not providing unique and useful content to users. I am afraid this will not be a quick fix; it's hard to justify how your 270+ domains will become relevant and helpful resources overnight!