Duplicate exact match domains flagged by Google - need help with reinclusion
-
Okay, I admit, I've been naughty... I have 270+ domains that are all exact match for city + keyword and have built tons of backlinks to all of them. I reaped the benefits... and now Google has found my duplicate templates and flagged them all. Question is, how do I get them reincluded quickly?
Do you guys think converting the sites to a basic WordPress setup, simply using 275 different templates, and then begging for reinclusion on each site manually would do it? Or do you recommend:
1. create a unique site template for each site
2. create unique content
Any other advice for getting reincluded? Aside from owning up and saying, "Hey, I used the same template for all the sites, and I have created new templates and unique content, so please let me back in."
-
A few more questions:
- Are they all hosted on the same server / IP address?
- Are they all registered with the same domain registrar / same company or person?
- Is the content duplicated too? Or have you spun the content?
- Are the links low quality?
- Do the sites all have the "same" links? (Have you submitted multiple templated sites to the same directories, etc.?)
-
EMDs that all have the same template and similar content, with tons of low-quality links (many built through keyword anchors) and, even worse, all those EMDs heavily interlinked: that might be the worst combination you can have!
-
I'm curious to know if this has been resolved. Like others here, I've been successful on several re-inclusions, so here is my two bits:
Template duplication is probably not related to the manual penalty.
Duplicate content will definitely tank your sites (but may not come through as a "violation" email in WMT).
The culprit is almost certainly backlinks. The bonus is that you have access to all sites - were any not penalized? When you got your backlinks, did you pay for packages across multiple domains?
When I do re-inclusions for agencies that handle multiple clients, I compile a database of backlinks using Open Site Explorer and the External Links section of WMT, and then use Excel functions to find which links are common to all penalized sites and which are not.
By doing so you can really highlight the backlinks most likely to be causing red flags. Remove those links, and write Google. Just tell them you understand you have bad links, and that you've removed as many as possible. List the links right in your email.
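Here's a minimal sketch of that comparison step in Python instead of Excel, assuming you've exported one CSV of linking URLs per penalized domain (the filenames are hypothetical):

```python
import csv

def load_links(path):
    """Read a one-column CSV of linking URLs into a set."""
    with open(path, newline="") as f:
        return {row[0].strip() for row in csv.reader(f) if row}

# Hypothetical exports from Open Site Explorer / WMT, one per penalized site.
files = ["site1_backlinks.csv", "site2_backlinks.csv", "site3_backlinks.csv"]
link_sets = [load_links(f) for f in files]

# Links shared by every penalized site are the prime suspects.
common = set.intersection(*link_sets)
unique = {f: s - common for f, s in zip(files, link_sets)}

print(f"{len(common)} backlinks are common to all penalized sites:")
for link in sorted(common):
    print(" ", link)
```

Same idea as the Excel approach: the intersection surfaces the shared paid packages or directory submissions, while the per-site leftovers are probably safe.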
-
Why don't you just come up with another scheme to make money!
This one seems to have run its course. Be happy with the money you made and move on.
-
If you made a good return on these sites, don't spin the content or do anything like that.
1. Replace all the content of the sites with good, high-quality content. This will make them stick forever.
2. Keep the design if it's solid in terms of SEO. (I don't think having the same design on all sites affects the ranking.)
3. You can link related cities to each other, after you have done 1 and 2 on the sites.
Stop doing black hat and keep it real. All black hat will come back and bite you in the a**.
Good luck
-
Still mixing a lot of fact and fiction. Exact match domains still work. People have been saying that Google is 'going after' them, but that's not really true. There are plenty of exact match domains out there that are legitimate, and separating the good guys from the bad guys is not going to happen at the domain level; it doesn't make sense. Also, please keep in mind what the duplicate content penalty is for and how it's used. Having your site drop after an algo shift is not uncommon. I know plenty of domain farmers (in the 2 million+ domain range) and the biggest issues they face are bad rankings because of lack of content, overly aggressive link building, link wheels, etc.
I believe that a warning in GWT means it's been flagged for a manual review, but I might be totally wrong about that. In the 15 years that I have been doing this, I have never been warned or received an email about unnatural link building.
I am currently chasing a link spammer/domain squatter right now, so I can attest that those methods still work just fine if you know what you're doing. Once your site makes it in front of a human, it's simple to figure out what other sites you have and what you are up to. Setting up single sites with different WHOIS info, different C-blocks, unique links, and unique content is extremely time consuming.
I have found that I make a lot more money focusing on a handful of sites and making them as good as possible than I do managing 100s of sites with spun content, crappy links, etc.
Start thinking outside the box to get real visitors coming to the site and interacting; social signals are a huge part of search. Look at microformats and push reviews; give people a reason to interact and refer other people. City directories and sites like that still have a place in the search engines.
If these sites really make money, I would go after local search because of the city name. I guess it depends on what you think a lot of money is. If all else fails, find a local biz that sells or caters to your KW and try to sell them the domain, especially if it's already getting traffic.
-
Yes, I also want to see that.
-
I have to agree with some of the other responses. Google is really cracking down on duplicate content and code. It is time consuming to write new content and redesign the sites, but you have to weigh the costs. If the site is your livelihood, then it is worth working on it to get back in Google's good books. If it is a site that you have created just to get link juice, then maybe you shouldn't put the time and effort into it.
-
If you want to target multiple locations then do it properly: do not duplicate your site over every state; rather, create a unique website with its own content that is more relevant to the location you are targeting. If they are unique enough then Google may consider them, as you have put the time and effort into making them different. This also includes local phone numbers in each state to show that you are actually in the state.
As for "What does a user in Montana care if the website looks the same as the one in New York?" - The problem is that local search may not dominate the right way in every state, so the poor user in California sees 7 sites that all lead to the same place. This now becomes spam that annoys the hell out of the user. It is all about quality; 50+ pages all leading to the same place will result in a low quality experience for the users and will therefore likely be denied.
-
Link profile and exact match. I was able to grab the equivalent of plumberlosangeles in every major city in the country. They were #1 for New York, LA, Houston, Philly, etc. This was a 7-figure gross-generating adventure.
-
What value? Same as ServiceMagic, for instance... We have as much right to be in a city as a local 'plumber'. What does a user in Montana care if the website looks the same as the one in New York?
-
They're split up over 5-6 hosts, but with identical WHOIS info, so there's maybe 20-30 per hosting account.
-
No, they were red flagged in Webmaster Tools. It says the site has violated Google's guidelines.
I'm guessing that means it's a manual penalty?
-
Not the best idea. Google will hunt you down and shut you down; my guess is that they may even track you over multiple Google accounts and closely watch any related material. This is black hat all the way. I spend my life reporting pages like yours and getting them shut down as quickly as possible; they are viruses and have no reason to be in the results, as they offer absolutely nothing to user experience. The only thing they give value to is your own pocket. There are Google Quality experts that are likely watching your sites like a hawk, and my guess is that your re-inclusion requests will never be granted. I don't mean to be so blunt... actually I do. What value are you bringing to the users?
-
Yeah, I would like to see any pointers towards #2; my competition is using these black hat techniques and we would like to see the best way to compete with them!
-
I would also start by identifying the sites that brought the most traffic or money. I presume you're running AdSense on these sites (side note: also remember to have fewer ads above the fold since Google's last page layout update).
The sites that are performing the best would be the ones I start with first when it comes to redesigning and making sure there is no further interlinking to the other sites.
Also make sure they are on different Class C IPs.
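Not from the thread itself, but here is a rough sketch of how you might spot domains sharing a Class C (/24) block: resolve each domain and group by the first three octets. The domain list is a placeholder.

```python
import socket
from collections import defaultdict

domains = ["plumberlosangeles.com", "plumbernewyork.com"]  # hypothetical list

by_class_c = defaultdict(list)
for domain in domains:
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        continue  # domain did not resolve; skip it
    class_c = ".".join(ip.split(".")[:3])  # e.g. "192.0.2"
    by_class_c[class_c].append((domain, ip))

# Any block with more than one domain is a potential footprint.
for block, sites in by_class_c.items():
    if len(sites) > 1:
        print(f"{block}.x shared by: {sites}")
```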
-
One question: are you using SEO hosting, or are they all hosted on the same IP?
-
Friend, don't spin or whatever; create unique and useful content. Otherwise Panda will crunch you...
-
Guilty!... and it takes one to know one.
You may have little chance of redeeming those sites. 301 redirects seem to wipe the crap clean. So you get the traffic and some of the link juice.
Or...
Pass the website on. Give it to someone else... Change the registration.
As an added note, I once heard Matt Cutts say, "Why would you want to buy one of these (burned) sites? Why wouldn't you start over with a new site? Then at least you are at ground zero instead of in a hole."
-
Google beat up real estate sites for the same reason. City + homes for sale type sites where all that changed was the city, especially where the sites would create mini fact pages for each city to try and pull rank for that area keyword search.
Too many sites have used this technique and the market is saturated with such low quality sites. Niche directories focusing on one city are better, but you need to watch the boilerplate content, and the listings you have need to contribute more to the conversation than just name, address, telephone like the Yellow Pages.
I doubt that Google is penalizing you for using the same template; as mentioned, so many sites use WordPress themes where the code structure is identical, and they are doing fine. I am curious if you are interlinking the websites.
Good content, usefulness and fresh contributions to the site will always be a good recipe. Fix, request re-inclusion, and you shouldn't have a problem.
-
Hi Ilya,
It's been a few months since your original question. Are you still looking for some advice regarding this? Or do you have something you could share with us about what you did and let us know what we should do or not do if we're in a similar situation?
Thanks!
-
Are all of the domains being hosted under the same IP, or do they have the same WHOIS info? If so, then I suggest you use different hosting providers (hence different IPs) and change up the WHOIS info for each of the domains (you can even use private registration).
-
IMHO this is absolutely a content issue. As opposed to spinning the content, my suggestion would be to spend a little bit of $$ on having unique articles written targeting your top-level head and body keywords on each domain. You can get unique articles created for as little as $3 each using oDesk, Guru, or similar services. At this point I think spinning the content is not a great approach, as Google has already recorded that these sites contain duplicate content and has flagged them as such. Spend the money on some real content, re-work your post and page titles to reflect the new unique content, re-generate your sitemaps to include the new unique titles, and then re-submit.
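For the sitemap regeneration step, a minimal standard-library sketch looks something like this; the URL list is a placeholder you would pull from your CMS or a crawl:

```python
from xml.sax.saxutils import escape

urls = [
    "http://www.example.com/",
    "http://www.example.com/services/",
]  # hypothetical: replace with your real page URLs

# Build one <url> entry per page, escaping any special characters.
entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n</urlset>"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```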
-
It is time for you to spin your content and try to make the sites distinct from each other. Try it, then ping the sites. If nothing changes after a month, you will have to write a love letter to Google.
-
No, I haven't. Thanks for sharing, David! I guess I'm doing it first thing tomorrow.
-
You may have already seen this article from a few weeks back, but it is closely related to what you are thinking about doing with the domains:
http://searchenginewatch.com/article/2086693/Exact-Match-Domains-Can-Double-Clicks-on-PPC-Ads
Good luck!
-
A while ago a business manager I work with purchased 30 exact match domains, wanting to use them for "SEO purposes". I actually poked around for ideas here too. In the end I decided I couldn't use them at all and got them redirected. The advice I got here was to focus my effort on the main site. What I plan to use them for at this point is AdWords; I will try to boost the CTR. If that does not work, I'll 301 them to the main site again.
My plan is to grab the domains and put a canonical link element across the whole domain, since it will duplicate the main site. Since AdWords URLs tend to get pasted into forum threads, blog posts and social networks, I'm kind of hoping for a collateral effect in terms of backlinks.
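If you go that route, here's a quick sanity check (my own sketch; the domain names are made up) that each duplicate domain actually serves a canonical tag pointing at the main site:

```python
import re
import urllib.request

MAIN = "http://www.mainsite.com/"            # hypothetical main site URL
mirrors = ["http://www.duplicate-emd.com/"]  # hypothetical duplicate domains

for url in mirrors:
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    canonical = None
    # Scan every <link> tag and pull the href from the canonical one.
    for tag in re.findall(r"<link[^>]+>", html, re.I):
        if "canonical" in tag.lower():
            m = re.search(r'href=["\']([^"\']+)', tag, re.I)
            if m:
                canonical = m.group(1)
            break
    status = "OK" if canonical == MAIN else "MISSING/WRONG"
    print(f"{url}: canonical={canonical!r} [{status}]")
```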
Even if you have your sites reincluded, there is no guarantee that they won't go down again pretty soon. This seems like a short term strategy anyway. Get the good URLs redirected to your main site and focus your effort there.
-
I would consider adding social features such as SEO-able comments/forums/Disqus etc. to help build each site's "local"-ness in keywords. Get each site as far away from the "template" mindset as you can.
In terms of "templates", keep in mind that Google is not looking at template code for the most part. They're looking at the content (stuff in text) on your site and its relevance.
I'd also consider shelving 260 of those domains and making 10 really great sites.
Best of luck!
-
I recommend the 2 points you mentioned. Of course, follow up with a dose of good old backlinks, backed by some social signals as well. Yes, it's more work, but you will see better ROI in the long run.
Also, echoing other users here: the template is not an issue at all. It's what you put on it - the content, titles and other meta tags - that needs to be unique, especially if you are a low-to-mid level authority. Very high authority sites can get away with duplicate content, as Google sometimes tends to blindly trust them over the original source.
-
They have been attacked time and time again - it hasn't happened yet, and MC keeps saying they are looking at it. Quite a few took a bump when I put that comment out, but weeks later most were back in the index.
-
Rishi -
Can you recommend a good resource that supports point number 2? I'd love to learn more about what they are actually doing to downgrade exact match domains.
Thanks
-
Google hates duplicate content, so much so that you will have problems with two URLs, let alone 270 of them. Don't bother trying anything funny, like spinning the articles, at least for now. When you submit your site for re-inclusion it will be examined by humans, not robots. You need to have a good story and a nice clean site if you want to get back into the game. Given that your sites are 98 percent duplicate content, that really means you have only one, maybe two, websites at best. I would make 301 redirects from all your other websites to your best performing website. This will show Google that you have solved your duplicate content problems while keeping all the links from the other sites. Good luck! dmac
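If you do consolidate with 301s, a small sketch like this (it uses the third-party requests package, and the domains are placeholders) can confirm that each old domain really answers with a 301 to the site you kept:

```python
import requests  # third-party: pip install requests

TARGET = "http://www.bestsite.com/"            # hypothetical surviving site
old_domains = ["http://www.oldcitysite.com/"]  # hypothetical redirected sites

for url in old_domains:
    # Don't follow the redirect; we want to inspect the first response.
    r = requests.get(url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "")
    ok = r.status_code == 301 and location.startswith(TARGET)
    print(f"{url} -> {r.status_code} {location or '(no Location header)'} "
          f"[{'OK' if ok else 'CHECK'}]")
```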
-
There's no need to worry about the templates... but your content sounds 99% dupe across all of your 270+ sites. I'd start with rewriting the pages for your best converting sites and take it one by one from there. Not sure if this is in fact a manual penalty; are your URLs still appearing in the SERPs when you search for them?
-
Sorry - I had to giggle when I first read your question
1. Google has made changes to the algo that are finding a number of these sites, especially ones with dupe content.
2. They are in the process of downgrading the value of exact match domains.
My advice? If you are adamant on getting these sites back up, take one - customise the template, make sure it's not linked to or linking to any of your others - rewrite the content and make it unique, and then send in a reinclusion request to Google, outlining what you did and WHY you did it...
-
Sorry if I didn't explain.
The sites are all service business-related. They're not AdSense in any way. They all have 100% identical content except for some keyword replacement where the city keyword is. Each site is related to a different city, for example plumber + Los Angeles, or plumber + New York. Each site has a different phone number, so I AM servicing those areas; however, my problem is the duplicate content. The link profiles are very different.
-
Okay, first of all, I don't think the penalty is caused by the duplicate templates (just think of all the WordPress sites using the same template!!)
Possible reasons for the penalty:
- Duplicate content (spin articles much?)
- Low quality links (link farms? paid links?)
The reasons above might all contribute to why your sites are penalised.
From the sound of things, it seems like you are doing some black hat stuff here, milking AdSense and not providing unique and useful content to users. I am afraid this would not be a quick fix; it's hard to justify how your 270+ domains will become relevant and helpful resources overnight!