Massive duplicate content: should it all be rewritten?
-
OK, I am asking this question hoping to confirm my conclusion.
I am auditing a domain whose owner is frustrated that they are coming in #2 for their regionally targeted search result and thinks it's their marketer/SEO's fault. After briefly auditing their site, I can say the marketing company they have doing their work has really done a great job. There are little things I have suggested they could do better, but nothing substantial; they are doing good SEO for the most part. Their competitor's site is ugly, has a terrible user experience, looks very unprofessional, and has some technical SEO issues from what I have seen so far. Yet it is beating them every time in the SERPs. I have not compared backlinks yet; I will in the next day or so. I was stopped when I found what seems to me to be the culprit.
I was looking for duplicate content internally, and they are doing fine there. Then my search turned external...
I copied and pasted a large chunk of one page into Google and got an exact-match return. Ruh-roh, Shaggy. I then found that there is another site, from a company across the country, that has identical content for possibly as much as half of the entire domain, something like 50-75 pages of exact copy. I thought at first they must have taken it from the site I am auditing. I was shocked to find out that the company I am auditing actually has an agreement to use the content from this other site. The marketing company has asked the owners to allow them to rewrite the content, but the owners have declined because "they like the content." So they don't even have authority on the content for approximately half of their site. This content also covers one of the three main topics linked to from the home page.
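For what it's worth, eyeballing Google results isn't the only way to gauge this. Here is a rough sketch of how the overlap could be quantified by comparing word shingles between two pages; this is just an illustration, and the page-text variables are placeholders rather than the actual sites:

```python
# Rough sketch: quantify how much of one page's copy overlaps another's.
# Assumes the visible text of each page has already been extracted into strings;
# the sample strings below are placeholders, not the real pages.
import re

def shingles(text, size=8):
    """Return the set of overlapping `size`-word sequences in the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def overlap(text_a, text_b, size=8):
    """Jaccard similarity of the two shingle sets: 0.0 = no shared copy, 1.0 = identical."""
    a, b = shingles(text_a, size), shingles(text_b, size)
    return len(a & b) / len(a | b) if (a | b) else 0.0

page_being_audited = "...large chunk of copy pulled from the audited page..."
page_it_was_copied_from = "...large chunk of copy pulled from the other company's page..."
print(f"Shingle overlap: {overlap(page_being_audited, page_it_was_copied_from):.0%}")
```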
My point to them is that I don't think you can optimize this domain enough to overcome the fact that a massive portion of the site is not original. I just don't think perfect optimization of duplicate content beats mediocre optimization of original content.
I now have to convince the owners they are wrong, never an easy task. Am I right, or am I overestimating the value of original content? Any thoughts?
Thanks in advance!
-
That's right, you posted that about Link Research Tools in my other question, but I haven't checked it out yet; I will do that ASAP. I definitely have some more investigation to do, but I still think that having a massive portion of their site as duplicate content is hurting. I will talk to them about adding content and see where that goes.
-
It can be a tough call. I would start with adding the content. Adding is probably better than removing right now. The links should probably be investigated further as well. Link Research Tools is my favorite, but it is expensive.
-
Yes, I used SEMrush and Raven as well as OSE. I looked at the directories and any titles that caught my eye. I definitely need to spend more time on the backlinks for the site I am auditing, though.
A question I asked elsewhere was how concerned I should be with high numbers of directory links. This site has quite a few, but another site I am working on has about 60% of its backlinks from Yellow Pages directories. I still don't know what I think about that.
Yeah, I was thinking they should add some more locally targeted content. The duplicate content has no local keywords in it; it doesn't mention their city at all. Like I said, that is nearly the largest portion of content on their site, and it has no local terms.
-
Did you check the domains? The numbers alone might not seem spammy, but there are high-authority domains that have still been causing Penguin problems: lots of directory links, any domain with "article" in the name, things of that sort. I would try using Majestic and SEMrush for a comparison.
Even with that information, I am not convinced that the duplicate content alone explains it. I would test it by adding 200-300 words of unique copy above the duplicate content on those pages to see if it helps the rankings at all. That will be more cost-effective than completely rewriting the content first.
-
The link metrics from OSE show that the site I am auditing has 69 referring domains with 1,199 links; a couple hundred of those are directories. After a quick once-through, there do not seem to be any spammy referring domains for either site. The competitor has 10 referring domains with 77 links. The average DA of the referring domains for the competitor is about half that of the site I am auditing. The competitor's anchor text is slightly better for the keywords in question, on average. All in all, though, the link portfolios are not what is beating the site I am auditing.
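In case anyone wants to reproduce the comparison, here is roughly how the two exports could be tallied. This is only a sketch: the CSV file names and column headers below are placeholders and would need to match whatever your OSE/Moz export actually contains.

```python
# Rough sketch of tallying two backlink CSV exports (e.g. from OSE).
# File names and column headers are placeholders; adjust them to your actual export.
import csv
from statistics import mean

def summarize(path, domain_col="Referring Domain", da_col="Domain Authority"):
    """Count total links and unique referring domains, and average the DA column."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    domains = {row[domain_col] for row in rows}
    avg_da = round(mean(float(row[da_col]) for row in rows), 1) if rows else 0.0
    return {"links": len(rows), "referring domains": len(domains), "avg DA": avg_da}

for label, export in [("audited site", "audited_links.csv"), ("competitor", "competitor_links.csv")]:
    print(label, summarize(export))
```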
-
That makes sense.
-
No, it's a totally regional industry; they aren't competitors, and they have exclusivity in their contracts, so they can't work with competitors inside a certain radius or whatever.
I didn't mean they should be ranking nationally; I am just saying it is possible, with regard to your question of whether local or national SEO is more important.
-
What? That is a little crazy. I don't think I could work for two companies trying to rank for the same keywords; that is such a conflict of interest.
Each site is unique, and there are over 200 ranking factors, so it isn't really fair to say that they should have the same results. The sites probably have enough differences to make ranking each of them a challenge, especially on the same key terms.
-
Yes, they are a local service company serving St. Louis. However, I will say that the marketing company they hired has a client in the same field in New England that ranks in the top 5 nationally for the same keywords, so to me there is no reason they shouldn't be able to do the same.
-
I totally agree that it needs to be rewritten. Is local SEO more important than ranking nationally?
-
Yeah, you are totally right; I have to dig into the backlinks. I will post the results back here when I get it done.
The results are local results, which is why the site with the original content doesn't rank but the duplicate does. The original content belongs to a company halfway across the US. Neither company ranks for the search terms on a national scale, but when I paste the content directly into Google and search, the original content does beat out the site I am auditing.
-
I think you are right in your assumption; duplicate content is never a good thing. However, if it isn't the same content on the site that is outranking them, then Google must be seeing the site you are auditing as more authoritative than the site they copied the content from. So, while it is an issue, the links might show you where the actual optimization work needs to happen. If things are neck and neck, as I understand them to be, then the link profile is going to be extremely important.
The content, no doubt, should be rewritten. Without a look at the link profile, though, you can't say it is the reason they aren't outranking the guys in the number one spot.
Related Questions
-
Content Page URL Question
Our main website is geared toward the city where we are located and includes the city name in content page URLs. We also have separate websites for three surrounding cities; these websites have duplicate content except for the city name: MainWebsite.com, City2-MainWebsite.com, City3-MainWebsite.com, City4-MainWebsite.com. We're restructuring to eliminate the location websites and only use the main website. The new site will have city pages. We have well-established Google business locations for all four cities. We will keep all locations, replacing the location websites with the main website. Should we remove City-IL from all content page URLs in the new site? We don't want to lose traffic/ranking for City2 or City3 because the content pages have City1 in the URL. Page URLs are currently formatted as follows: www.MainWebsite.com/Service-1-City1-IL.html, www.MainWebsite.com/Service-2-City1-IL.html, www.MainWebsite.com/Service-3-City1-IL.html, www.MainWebsite.com/Service-4-City1-IL.html. Thanks!
-
International subdirectory without localized content - best practice / need advice
Hi there, our site uses a subdirectory for regional and multilingual sites, as shown below, for 200+ countries, e.g. /en_US/. All sites have roughly the same content and are in English. We have hreflang tags but still have crawl issues. Is there another URL structure you would recommend? Are there any other ways to avoid the duplicate page and crawl budget issues outside of the hreflang tag? Appreciate it!
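For reference, reciprocal hreflang annotations for locale subdirectories like /en_US/ are usually generated from a single locale list so that every version carries the same set of tags. A minimal sketch follows; the domain, locale list, and path are placeholders, not the real site:

```python
# Minimal sketch: generate the reciprocal hreflang tags each locale version should carry.
# The domain, locale list, and path are placeholders, not the real site.
LOCALES = ["en_US", "en_GB", "fr_FR", "de_DE"]

def hreflang_tags(path, locales=LOCALES, domain="https://www.example.com"):
    """Build <link rel="alternate"> tags for every locale version of `path`."""
    tags = [
        f'<link rel="alternate" hreflang="{loc.replace("_", "-")}" href="{domain}/{loc}{path}" />'
        for loc in locales
    ]
    # x-default tells search engines which version to show for unmatched locales.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{domain}/en_US{path}" />')
    return "\n".join(tags)

print(hreflang_tags("/pricing/"))
```
-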
Question about partial duplicate content on location landing pages of multilocation business
Hi everyone, I am a psychologist in private practice in Colorado, and I recently went from one location to two. I'm currently updating my website to better accommodate the second location. I also plan continued expansion in the future, so there will be more and more locations as time goes on. As a result, I am making my website's current homepage non-location-specific and creating location landing pages, as I have seen written about in many places.
My question is: I know that location landing pages should have unique content, and I have plenty of this, but how much content is it also okay to have be duplicate across the location landing pages and the homepage? For instance, here is the current draft of the new homepage (these are not live yet): http://www.effectivetherapysolutions.com/dev/ Here are the drafts of the location landing pages: http://www.effectivetherapysolutions.com/dev/denver-office and http://www.effectivetherapysolutions.com/dev/colorado-springs-office And for reference, here is the current homepage that is actually live for my single Denver location: http://www.effectivetherapysolutions.com/
As you can see, the location landing pages have the following sections of unique content: a therapist picture at the top; testimonial quotes (the one on the homepage is the only one I have iframed in this block to keep it out of the crawl, so that it appears as unique content on the Denver page); therapist bios; the GMB listing; and driving directions and hours. I also haven't added these yet, but we will also have unique client success stories and appropriately tagged images of the offices.
So that's plenty of unique content on the pages, but I also have the following sections of content that are identical or nearly identical to what I have on the homepage: the intro paragraph; the blue and green "adult" and "child/teen" boxes under the intro paragraph; the "our treatment really works" section; and the "types of anxiety we treat" section. Is that okay, or is that too much duplicate content?
The reason I have it that way is that my website has been very successful for years at converting site visitors into paying clients, and I don't want to lose aspects of the page that I know work when people land on it. Now that I am optimizing the location landing pages to be where people end up instead of the homepage, I want them to still see all of that content that I know is effective at conversion. If people here do think it is too much, one possible solution is to turn parts of it into pictures or put them into iframes on the location pages so Google doesn't crawl those parts, but leave them normal on the homepage so it still gets crawled there. I've seen a lot written about not having duplicate content on location landing pages for this type of website, but everything I've read seems to refer to entire pages being copied with just the location names changed, which is not what I'm doing, hence my question. Thanks, everyone!
-
Does having 2 separate domains with similar content always = duplicate content?
I work for a global company which is in the process of launching its US and European websites (having just re-launched the Australian site, migrated from an old domain), all with separate domains for the purpose of localising. However, the US website content will essentially be the same as the Australian one with minor changes (z instead of s, slightly different service offerings, etc.), and the core information will be the same as the AU site. Will this be seen as duplicate content, and is there a way we can structure this so that the content won't be seen as duplicate but is still a separate localised website? Thank you.
-
Multi-Country Multi-Language content website
Hi Community! I'm starting a website that is going to have content from various countries and in several languages. What is the best URL structure in this case? I was thinking of doing something like:
English name of the plant, content in English, for the USA: www.flowerpedia.com/flowers/red-roses
Spanish name of the plant, content in Spanish, for MX: mx.flowerpedia.com/es/rosas/rosas-rojas
English name of the plant, content in English, for MX: mx.flowerpedia.com/roses/red-roses (this content is not the same as www.flowerpedia.com/flowers/red-roses)
Content for Mexico would not exist in languages other than English and Spanish. So, for example, mx.flowerpedia.com/jp/flowers/red-roses would not exist and would redirect to the English version: mx.flowerpedia.com/flowers/red-roses. What would be the best URL structure in this case?
-
Duplicate content on a proxy site?
I have a local client with a 500-page site. They advertise online and use traditional media like direct mail. A print media company, Valpak, has started a website and wants the client to use their trackable phone number and a proxy website. When I type the proxy domain into the browser, it appears to be the client's home page at this proxy URL. The vendor wishes to track activity on its site to prove its value or something. My question is: is there any "authority" risk to my client's website by allowing this proxy site?
-
Website and eshop with the same product description: is it duplicate content?
Hi there! I'm building a website that is divided into "marketing" and "shop" sections. The two sites are being authored by two companies (my company is doing the marketing one). The marketing site has all the company's products, while the shop will sell just some of them. I'm facing the problem of duplicated content and want to ask you whether it would be a problem/mistake to use the same product description (and a similar URL) for the same product on both sites, and what the right way to do it is (without rewriting product descriptions). The main site will be www.companyname.com and the shop will be shop.companyname.com. Thanks, Francesco
-
Canonical for 80-90% duplicate content help
Hi. I seem to spend more time asking questions at the moment. I have a site I have revamped, www.themorrisagency.co.uk, and I am working through sorting out the 80-90% duplicated content that just swaps in a spattering of geographical and band-style terms, e.g. http://www.themorrisagency.co.uk/band-hire/greater-manchester/ with "Manchester" changed for http://www.themorrisagency.co.uk/band-hire/oxfordshire/, etc. So I am going through this slow but essential process at the moment. I have a main http://www.themorrisagency.co.uk/band-hire/ page. My question is: would it be sensible (using the Yoast SEO plugin) to use a canonical redirect as a temporary solution from these duplicate pages to http://www.themorrisagency.co.uk/band-hire/, rather than removing them? What are your thoughts, as I am aware that the damage from using a rel=canonical could make it worse. Thanks as always, Daniel