Massive duplicate content: should it all be rewritten?
-
OK, I am asking this question in hopes of confirming my conclusion.
I am auditing a domain whose owner is frustrated that they are coming in #2 for their regionally tagged search result and thinks it's their marketer/SEO's fault. After briefly auditing the site, I can say the marketing company doing their work has really done a great job. There are little things I have suggested they could do better, but nothing substantial; they are doing good SEO for the most part. The competitor's site is ugly, has a terrible user experience, looks very unprofessional, and has some technical SEO issues from what I have seen so far. Yet it is beating them every time on the SERPs. I have not compared backlinks yet; I will in the next day or so. I was halted when I found what seems to me to be the culprit.
I was looking for duplicate content internally, and they are doing fine there, so my search turned external.
I copied and pasted a large chunk of one page into Google and got an exact-match return... ruh-roh, Shaggy. I then found another site, from a company across the country, that has identical content for possibly as much as half of the entire domain, something like 50-75 pages of exact copy. At first I thought they must have taken it from the site I am auditing. I was shocked to find out that the company I am auditing actually has an agreement to use the content from this other site. The marketing company has asked the owners to allow them to rewrite the content, but the owners have declined because "they like the content." So they don't even have authority over the content on approximately half of their site, and this content covers one of the three main topics linked from the home page.
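If it helps anyone reading this later, here is a rough way to put a number on that overlap instead of eyeballing pasted chunks. This is only a sketch: the URLs, the 8-word shingle size, and the crude tag stripping are placeholders, not the actual sites or a polished crawler.

```python
import re
import requests

def visible_text(url):
    """Fetch a page and crudely strip scripts/styles/tags to get its text."""
    html = requests.get(url, timeout=10).text
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)  # drop scripts and styles
    text = re.sub(r"<[^>]+>", " ", html)                        # drop remaining tags
    return re.sub(r"\s+", " ", text).lower().strip()

def shingles(text, n=8):
    """Set of n-word shingles; shared shingles indicate copied passages."""
    words = text.split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(url_a, url_b, n=8):
    """Jaccard similarity of the two pages' shingle sets (0 = unique, 1 = identical)."""
    a, b = shingles(visible_text(url_a), n), shingles(visible_text(url_b), n)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical URLs -- swap in one page from each domain being compared.
print(overlap("https://audited-site.example/service-page",
              "https://content-source.example/service-page"))
```

Anything much above 0.5 page-for-page is the kind of wholesale copying I am describing here.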
My point to them is that I don't think you can optimize this domain enough to overcome the fact that a massive portion of the site is not original. I just don't think perfect optimization of duplicate content beats mediocre optimization of original content.
I now have to convince the owners they are wrong, which is never an easy task. Am I right, or am I overestimating the value of original content? Any thoughts?
Thanks in advance!
-
That's right, you posted about Link Research Tools in my other question, but I haven't checked it out yet; I will do that ASAP. I definitely have more investigating to do, but I still think that having a massive portion of their site as duplicate content is hurting them. I will talk to them about adding content and see where that goes.
-
It can be a tough call. I would start by adding content; adding is probably better than removing right now. The links should probably be investigated further as well. Link Research Tools is my favorite, but it is expensive.
-
Yes, I used SEMrush and Raven as well as OSE. I looked at the directories and any titles that caught my eye. I need to spend more time on backlinks for the site I am auditing, for sure.
A question I asked elsewhere was how concerned I should be about a high number of directory links. This site has quite a few, but another site I am working on has about 60% of its backlinks from yellow-pages directories. I still don't know what I think about that.
Yeah, I was thinking they should add some more locally targeted content. The duplicate content has no local keywords in it; it doesn't mention their city at all. Like I said, that is nearly the largest portion of content on their site, and it has no local terms.
-
Did you check the domains themselves? The numbers alone might not look spammy, but there are high-authority domains that have been causing Penguin problems: lots of directory links, any domain with "article" in the title, things of that sort. I would try using Majestic and SEMrush for a comparison (there is a rough triage sketch at the end of this reply).
Even with that information, I am not convinced that the duplicate content alone is enough to explain it. I would test it by adding 200-300 words of unique copy above the duplicate content on those pages to see if it helps the rankings at all. That will be more cost-effective than completely rewriting the content first.
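For the domain check, something like this is how I would triage a backlink export for directory- and article-style referring domains. The CSV name and column headers are hypothetical; adjust them to whatever Majestic, SEMrush, or OSE actually exports.

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Words that tend to show up in directory/article-farm domains and page titles.
FLAG_WORDS = ("directory", "directories", "article", "articles", "yellowpages", "listing")

def flag_suspect_domains(csv_path):
    """Count links per referring domain whose domain or title hits a flag word."""
    suspects = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row.get("Source URL", "")).netloc.lower()
            title = row.get("Title", "").lower()
            if any(word in domain or word in title for word in FLAG_WORDS):
                suspects[domain] += 1
    return suspects

# Hypothetical export file name.
for domain, links in flag_suspect_domains("backlink_export.csv").most_common(20):
    print(f"{links:4d}  {domain}")
```

It won't catch everything, but it surfaces the obvious candidates to review by hand before paying for a deeper toxic-link analysis.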
-
So, the link metrics from OSE: the site I am auditing has 69 referring domains with 1,199 links, a couple hundred of which are from directories. There do not seem to be any spammy referring domains for either site after a quick pass. The competitor has 10 referring domains with 77 links. The average DA of the competitor's referring domains is about half that of the site I am auditing. The competitor's anchor text is slightly better for the keywords in question, on average. All in all, though, the link portfolios are not what is beating the site I am auditing.
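For anyone curious, this is roughly how I boiled the two exports down to those numbers. The file names and column headers are made up; they will differ depending on which tool the export comes from.

```python
import csv
from statistics import mean

def profile_summary(csv_path):
    """Referring-domain count, total links, and average DA from a backlink export."""
    domains = {}  # referring domain -> (link count, domain authority)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            name = row.get("Referring Domain", "")
            da = float(row.get("Domain Authority", 0) or 0)
            count, _ = domains.get(name, (0, da))
            domains[name] = (count + 1, da)
    total_links = sum(count for count, _ in domains.values())
    avg_da = mean(da for _, da in domains.values()) if domains else 0.0
    return len(domains), total_links, avg_da

# Hypothetical export files for the two sites being compared.
for label, path in [("audited site", "audited_backlinks.csv"),
                    ("competitor", "competitor_backlinks.csv")]:
    ref_domains, links, avg_da = profile_summary(path)
    print(f"{label}: {ref_domains} referring domains, {links} links, avg DA {avg_da:.1f}")
```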
-
That makes sense.
-
No, it's a totally regional industry and the two clients aren't competitors; the marketing company has exclusivity clauses in its contracts, so it can't work with competitors inside a certain radius or whatever.
I didn't mean they should be ranking nationally; I am just saying it is possible, in regard to your question about whether local or national SEO is more important.
-
What? That is a little crazy. I don't think I could work for two companies trying to rank for the same keywords; that is such a conflict of interest.
Each site is an individual, and there are over 200 ranking factors, so it isn't really fair to say that they should have the same results. The sites are different and probably have enough differences to make ranking each of them a challenge, especially on the same key terms.
-
Yes, they are a local service company serving St. Louis. However, I will say that the marketing company they hired has a client in the same field in New England that ranks in the top 5 nationally for the same keywords, so to me there is no reason they shouldn't be able to do the same.
-
I totally agree that it needs to be rewritten. Is local SEO more important than ranking nationally?
-
Yeah, you are totally right; I have to dig into the backlinks. I will post the results back here when I get that done.
The results are local results, so that is why the site with the original content doesn't rank but the duplicate does: the original content belongs to a company half the country away. Neither company ranks for the search terms on a national scale, but when I paste the content directly into Google and search, the original content does beat out the site I am auditing.
-
I think you are right in your assumption. Duplicate content is never a good thing. However, if it isn't the same content on the site that is outranking them, then Google must be seeing the site you are auditing as more authoritative than the site they copied the content from. So, while it is an issue, the links might show you where the actual optimization needs to happen. If things are neck and neck, as I understand it, then the link profile is going to be extremely important.
The content, no doubt, should be rewritten. Without a look at the link profile, though, you can't say it is the reason they aren't outranking the site in the number one spot.