Massive duplicate content: should it all be rewritten?
-
OK, I am asking this question hoping to confirm my conclusion.
I am auditing a domain whose owner is frustrated that they are coming in #2 for their regionally tagged search result and thinks it's their marketer/SEO's fault. After briefly auditing their site, I can say the marketing company doing their work has really done a great job. There are little things I have suggested they could do better, but nothing substantial; they are doing good SEO for the most part. Their competitor's site is ugly, has a terrible user experience, looks very unprofessional, and has some technical SEO issues from what I have seen so far. Yet it is beating them every time in the SERPs. I have not compared backlinks yet; I will in the next day or so. I was halted when I found what seems to me to be the culprit.
I was looking for duplicate content internally, and they are doing fine there. Then my search turned external...
I copied and pasted a large chunk of one page into Google and got an exact-match return... ruh-roh, Shaggy. I then found another site, from a company across the country, that has identical content for possibly as much as half of their entire domain: something like 50-75 pages of exact copy. I thought at first they must have taken it from the site I am auditing. I was shocked to find out that the company I am auditing actually has an agreement to use the content from this other site. The marketing company has asked the owners to allow them to rewrite the content, but the owners have declined because "they like the content." So they don't even have authority over the content for approximately half of their site. This content is also one of the three main topics linked from the home page.
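(As an aside for anyone wanting to automate that copy-paste check: a rough sketch, assuming you have already pulled the plain text of each page, is to compare overlapping word shingles between the two pages. The function names and the shingle size here are my own choices for illustration, not from any tool mentioned in this thread.)

```python
def shingles(text, n=8):
    """Break text into overlapping n-word shingles (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplicate_ratio(page_a, page_b, n=8):
    """Fraction of page_a's shingles that also appear in page_b.

    A value near 1.0 means page_a is essentially a copy of page_b.
    """
    a, b = shingles(page_a, n), shingles(page_b, n)
    if not a:
        return 0.0
    return len(a & b) / len(a)
```

Run it over each page of the audited site against the other company's pages and anything with a high ratio is a candidate for rewriting.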
My point to them is that I don't think you can optimize this domain enough to overcome the fact that a massive portion of the site is not original. I just don't think perfect optimization of duplicate content beats mediocre optimization of original content.
I now have to convince the owners they are wrong, never an easy task. Am I right, or am I overestimating the value of original content? Any thoughts?
Thanks in advance!
-
That's right, you posted about Link Research Tools in my other question, but I haven't checked it out yet; I will do that ASAP. I definitely have some more investigating to do, but I still think that having a massive portion of their site be duplicate content is hurting them. I will talk to them about adding content and see where that goes.
-
It can be a tough call. I would start with adding content; adding is probably better than removing right now. The links should probably be investigated further as well. Link Research Tools is my favorite, but it is expensive.
-
Yes, I used SEMrush and Raven as well as OSE (Open Site Explorer). I looked at the directories and any titles that caught my eye. I do need to spend more time on backlinks for the site I am auditing, though.
A question I asked elsewhere was how concerned I should be about high amounts of directory links. This site has quite a few, but another site I am working on has about 60% of its backlinks from Yellow Pages-style directories. I still don't know what I think about that.
Yeah, I was thinking they should add some more locally targeted content. The duplicate content has no local keywords in it; it doesn't mention their city at all. Like I said, it is nearly the largest portion of content on their site and has no local terms.
-
Did you check the domains? The numbers alone might not seem spammy, but there are high-authority domains that have been causing Penguin problems: lots of directory links, any domain with "Article" in the title, things of that sort. I would try using Majestic and SEMrush for a comparison.
Even with that information, I am not convinced the duplicate content alone is the problem. I would test it by adding 200-300 words of unique copy above the duplicate content on those pages to see if it helps the rankings at all. That will be more cost-effective than completely rewriting the content first.
-
So, link metrics from OSE: the site I am auditing has 69 referring domains with 1,199 links, a couple hundred of which are directories. There do not seem to be any spammy referring domains for either site after a quick once-through. The competitor has 10 referring domains with 77 links. The average DA of the competitor's referring domains is about half that of the site I am auditing. The competitor's anchor text is, on average, slightly better for the keywords in question. All in all, though, the link portfolios are not what is beating the site I am auditing.
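(For anyone wanting to reproduce this kind of average-DA comparison from a CSV export of referring domains, here is a minimal sketch. The column name is an assumption on my part; adjust it to whatever your tool's export actually uses.)

```python
import csv
from statistics import mean

def avg_domain_authority(csv_path, da_column="Domain Authority"):
    """Average DA across referring domains in a CSV export.

    Assumes one row per referring domain and a numeric DA column;
    the default column name is hypothetical.
    """
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    return mean(float(row[da_column]) for row in rows)
```

Running it once per site's export gives you the two averages to compare directly, instead of eyeballing the reports.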
-
That makes sense.
-
No, it's a totally regional industry; the two sites aren't competitors, and the marketing company has exclusivity in its contracts, so it can't work with competitors inside a certain radius.
I didn't mean they should be ranking nationally. I am just saying it is possible, in regard to your question of whether local or national SEO is more important.
-
What? That is a little crazy. I don't think I could work for two companies trying to rank for the same keywords; that is such a conflict of interest.
Each site is an individual, and there are over 200 ranking factors, so it isn't really fair to say they should have the same results. The sites are different and probably have enough differences to make ranking each of them a challenge, especially on the same key terms.
-
Yes, they are a local service company serving St. Louis. However, I will say that the marketing company they hired has a client in the same field in New England that ranks top 5 nationally for the same keywords, so to me there is no reason they shouldn't be able to do the same.
-
I totally agree that it needs to be rewritten. Is local SEO more important than ranking nationally?
-
Yeah, you are totally right; I have to dig into the backlinks. I will post the results back here when I get that done.
The results are local results, so that is why the site with the original content doesn't rank but the duplicate does; the original content belongs to a company half the US away. Neither company ranks for the search terms on a national scale, but when I paste content directly into Google and search, the original content does beat out the site I am auditing.
-
I think you are right in your assumption; duplicate content is never a good thing. However, if it isn't the same content on the site that is outranking them, then Google must be seeing the site you are auditing as more authoritative than the site they copied the content from. So, while it is an issue, the links might show you where the actual optimization needs to happen. If things are neck and neck, as I understand them to be, then the link profile is going to be extremely important.
The content, no doubt, should be rewritten. Without a look at the link profile, though, you can't say it is the reason they aren't outranking the site in the number one spot.