Massive duplicate content: should it all be rewritten?
-
OK, I am asking this question hoping to confirm my conclusion.
I am auditing a domain whose owner is frustrated that they are coming in #2 for their regionally targeted search result and thinks it's their marketer/SEO's fault. After briefly auditing their site, I can say the marketing company doing their work has really done a great job. There are little things I have suggested they could do better, but nothing substantial; they are doing good SEO for the most part. Their competitor's site is ugly, has a terrible user experience, looks very unprofessional, and has some technical SEO issues from what I have seen so far. Yet it is beating them every time in the SERPs. I have not compared backlinks yet; I will in the next day or so. I was halted when I found what seems to me to be the culprit.
I was looking for duplicate content internally, and they are doing fine there, so my search turned external...
I copied and pasted a large chunk of one page into Google and got an exact-match return... ruh-roh, Shaggy. I then found that there is another site, from a company across the country, with identical content for possibly as much as half of their entire domain, something like 50-75 pages of exact copy. I thought at first they must have taken it from the site I am auditing. I was shocked to find out that the company I am auditing actually has an agreement to use the content from this other site. The marketing company has asked the owners to allow them to rewrite the content, but the owners have declined because "they like the content." So they don't even have authority over the content for approximately half of their site. This content is also one of the three main topics linked directly from the home page.
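For anyone who wants to run this check at scale instead of pasting chunks into Google by hand, here is a rough Python sketch of the idea. The URLs are placeholders (not the actual sites), and it assumes the pages are plain server-rendered HTML:

```python
# Rough sketch: quantify cross-domain duplication by comparing overlapping
# 8-word "shingles" between two pages. Placeholder URLs, not the real sites.
import re
import requests
from bs4 import BeautifulSoup

def page_text(url):
    """Fetch a page and return its visible text, lowercased."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    return re.sub(r"\s+", " ", soup.get_text(" ")).lower()

def shingles(text, size=8):
    """Break text into overlapping word sequences for comparison."""
    words = text.split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def overlap(url_a, url_b):
    """Share of page A's shingles that also appear on page B."""
    a, b = shingles(page_text(url_a)), shingles(page_text(url_b))
    return len(a & b) / len(a) if a else 0.0

# Hypothetical example: anything well above ~0.3 is worth a closer look.
print(overlap("https://client-site.example/services",
              "https://other-site.example/services"))
```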
My point to them is that I don't think you can optimize this domain enough to overcome the fact that a massive portion of the site is not original. I just don't think perfect optimization of duplicate content beats mediocre optimization of original content.
I now have to convince the owners they are wrong, never an easy task. Am I right, or am I overestimating the value of original content? Any thoughts?
Thanks in advance!
-
That's right, you posted that about Link Research Tools in my other question, but I haven't checked them out yet; I will do that ASAP. I definitely have some more investigating to do, but I still think that having a massive portion of their site as duplicate content is hurting them. I will talk to them about adding content and see where that goes.
-
It can be a tough call. I would start with adding content; adding is probably better than removing right now. The links should probably be investigated further as well. Link Research Tools is my favorite, but it is expensive.
-
Yes, I used SEMrush and Raven as well as OSE. I looked at the directories and any titles that caught my eye. I do need to spend more time on backlinks for the site I am auditing, though.
A question I asked elsewhere was how concerned I should be with high amounts of directory links. This one has quite a few, but another site I am working on gets about 60% of its backlinks from Yellow Pages-style directories. I still don't know what I think about that.
Yeah, I was thinking they should add some more locally targeted content. The duplicate content has no local keywords in it; it doesn't mention their city at all. Like I said, that is nearly the largest portion of content on their site, and it has no local terms.
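Here is the kind of quick-and-dirty check I ran to confirm that. The URLs and the city terms below are placeholders, so treat it as a sketch rather than the exact script:

```python
# Check whether a set of pages ever mentions the target city.
# Placeholder URLs and assumed local terms; swap in the real ones.
import requests
from bs4 import BeautifulSoup

CITY_TERMS = ["st. louis", "st louis", "saint louis"]

def mentions_city(url):
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
    return any(term in text for term in CITY_TERMS)

pages = [
    "https://client-site.example/topic-a/page-1",  # hypothetical URLs
    "https://client-site.example/topic-a/page-2",
]
for url in pages:
    print(url, "mentions city" if mentions_city(url) else "no local terms")
```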
-
Did you check the domains? The numbers alone might not seem spammy, but there are high-authority domains that have been causing Penguin problems: lots of directory links, any domain with "article" in the name, things of that sort. I would try using Majestic and SEMrush for a comparison.
Even with that information, I am not convinced that the duplicate content alone explains it. I would test it by adding 200-300 words of unique copy above the duplicate content on those pages to see if it helps the rankings at all. That will be more cost-effective than completely rewriting the content first.
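If it helps, this is a minimal sketch of the kind of domain check I mean. It assumes the backlink export is a CSV with a "source_domain" column, which is a guess; adjust the column name to whatever OSE, Majestic, or SEMrush actually gives you:

```python
# Flag referring domains whose names match patterns that tend to correlate
# with Penguin trouble (directories, article networks, etc.).
import csv
from collections import Counter

SUSPECT_PATTERNS = ["directory", "article", "articles", "yellowpages", "links"]

def flag_domains(csv_path):
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            domain = row["source_domain"].lower()  # assumed column name
            for pattern in SUSPECT_PATTERNS:
                if pattern in domain:
                    counts[domain] += 1
                    break
    return counts

for domain, links in flag_domains("backlinks_export.csv").most_common(20):
    print(f"{domain}: {links} links")
```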
-
So, link metrics from OSE: the site I am auditing has 69 referring domains with 1,199 links, a couple hundred of which are from directories. There do not seem to be any spammy referring domains for either site after a quick once-through. The competitor has 10 referring domains with 77 links. The average DA of the competitor's referring domains is about half that of the site I am auditing. The competitor's anchor text is slightly better for the keywords in question, on average. All in all, though, the link portfolios are not what is beating the site I am auditing.
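For reference, this is roughly how I pulled those numbers together from the two exports. The column names ("source_domain", "domain_authority") are assumptions; match them to the actual export headers:

```python
# Summarize a backlink export: referring domains, total links, average DA.
import csv

def link_profile(csv_path):
    domains = {}
    total_links = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            total_links += 1
            # keep the last DA seen per domain; assumed column names
            domains[row["source_domain"]] = float(row["domain_authority"])
    avg_da = sum(domains.values()) / len(domains) if domains else 0.0
    return {"referring_domains": len(domains),
            "links": total_links,
            "avg_da": round(avg_da, 1)}

print("client:    ", link_profile("client_links.csv"))
print("competitor:", link_profile("competitor_links.csv"))
```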
-
That makes sense
-
No, it's a totally regional industry; they aren't competitors, and the contracts include exclusivity, so the marketing company can't work with competitors inside a certain radius.
I didn't mean they should be ranking nationally. I am just saying it is possible, in regard to your question of whether local or national SEO is more important.
-
What? That is a little crazy. I don't think I could work for two companies trying to rank for the same keywords; that is such a conflict of interest.
Each site is an individual, and there are over 200 ranking factors, so it isn't really fair to say that they should have the same results. The sites are different and probably have enough differences to make ranking each of them a challenge, especially on the same key terms.
-
Yes, they are a local service company serving St. Louis. However, I will say that the marketing company they hired has a client in the same field in New England that ranks top 5 for the same keywords nationally, so to me there is no reason they shouldn't be able to do the same.
-
I totally agree that it needs to be rewritten. Is local SEO more important than ranking nationally?
-
Yeah, you are totally right, I have to dig into the backlinks. I will post the results back here when I get it done.
The results in question are local results, which is why the site with the original content doesn't rank but the duplicate does; the original content belongs to a company half the US away. Neither company ranks for the search terms on a national scale, but when I paste content directly into Google and search, the original content does beat out the site I am auditing.
-
I think you are right in your assumption. Duplicate content is never a good thing. However, if it isn't the same content on the site that is outranking them, then Google must be seeing the site you are auditing as more authoritative than the site they copied the content from. So, while it is an issue, the links might show you where the actual optimization needs to happen. If things are neck and neck, as I understand they are, then the link profile is going to be extremely important.
The content, no doubt, should be rewritten. Without a look at the link profile, though, you can't say it is the reason they aren't outranking the guys in the number one spot.