Massive duplicate content: should it all be rewritten?
-
OK, I am asking this question in hopes of confirming my conclusion.
I am auditing a domain whose owner is frustrated that they are coming in #2 for their regionally targeted search results and thinks it's their marketer/SEO's fault. After briefly auditing the site, I'd say the marketing company doing their work has done a great job. There are little things I have suggested they could do better, but nothing substantial; they are doing good SEO for the most part. Their competitor's site is ugly, has a terrible user experience, looks very unprofessional, and has some technical SEO issues from what I have seen so far, yet it beats them every time in the SERPs. I have not compared backlinks yet; I will in the next day or so. I was halted when I found what seems to me to be the culprit.
I was looking for duplicate content internally, and they are doing fine there. Then my search turned external...
I copied and pasted a large chunk of one page into Google and got an exact-match return... ruh-roh, Shaggy. I then found another site, from a company across the country, with identical content on possibly as much as half of the entire domain: something like 50-75 pages of exact copy. At first I thought they must have taken it from the site I was auditing, so I was shocked to find out that the company I am auditing actually has an agreement to use the content from this other site. The marketing company has asked the owners to let them rewrite the content, but the owners have declined because "they like the content." So they don't even have authority over the content for approximately half of their site, and that content covers one of the three main topics linked from the home page.
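As an aside, for anyone who wants to scale this check beyond manual copy-and-paste, a rough sketch of a script that compares two pages for near-duplicate content using word shingles and Jaccard similarity might look like the following. The URLs, the shingle size, and the 0.8 threshold are placeholders, not the actual sites or values from this audit:

```python
# Rough sketch: flag near-duplicate content between two pages using
# word shingles and Jaccard similarity. The URLs and the 0.8 threshold
# are placeholders, not the actual sites discussed in this thread.
import re
import urllib.request

def fetch_text(url):
    """Download a page and strip tags crudely; a real audit should use a proper HTML parser."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    return re.sub(r"<[^>]+>", " ", html)

def shingles(text, k=8):
    """Return the set of k-word shingles from the page text."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (1.0 means identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

page_a = shingles(fetch_text("https://site-being-audited.example/some-page"))
page_b = shingles(fetch_text("https://content-source.example/some-page"))

score = jaccard(page_a, page_b)
print(f"Shingle similarity: {score:.2f}")
if score > 0.8:
    print("These pages are near-duplicates.")
```

Run it against every page pair you suspect; anything scoring near 1.0 is an exact copy, like the 50-75 pages described above.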
My point to them is that I don't think you can optimize this domain enough to overcome the fact that a massive portion of the site is not original. I just don't think perfect optimization of duplicate content beats mediocre optimization of original content.
I now have to convince the owners they are wrong, which is never an easy task. Am I right, or am I overestimating the value of original content? Any thoughts?
Thanks in advance!
-
That's right, you posted about Link Research Tools in my other question, but I haven't checked it out yet; I will do that ASAP. I definitely have more investigating to do, but I still think that having a massive portion of their site as duplicate content is hurting them. I will talk to them about adding content and see where that goes.
-
It can be a tough call. I would start by adding content; adding is probably better than removing right now. The links should be investigated further as well. Link Research Tools is my favorite tool for that, but it is expensive.
-
Yes, I used SEMrush and Raven as well as OSE. I looked at the directories and any titles that caught my eye. I definitely need to spend more time on backlinks for the site I am auditing, though.
A question I asked elsewhere was how concerned I should be about a high number of directory links. This site has quite a few, but another site I am working on gets about 60% of its backlinks from Yellow Pages directories. I still don't know what to make of that.
Yeah, I was thinking they should add some more locally targeted content. The duplicate content has no local keywords in it; it doesn't mention their city at all. Like I said, it is nearly the largest portion of content on their site, and it contains no local terms.
-
Did you check the domains themselves? The numbers alone might not look spammy, but there are high-authority domains that have been causing Penguin problems: lots of directory links, any domain with "Article" in the name, things of that sort. I would try using Majestic and SEMrush for a comparison.
Even with that information, I am not convinced that the duplicate content alone explains it. I would test it by adding 200-300 words of unique copy above the duplicate content on those pages to see if it helps the rankings at all. That will be more cost-effective than completely rewriting the content first.
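If it helps, a quick-and-dirty triage of a referring-domain export for those patterns could look something like this sketch. The CSV filename, the column name, and the keyword list are assumptions; adjust them to whatever your link tool actually exports:

```python
# Quick triage of a referring-domain CSV export for Penguin-risk patterns.
# "referring_domains.csv", the "domain" column, and RISK_KEYWORDS are
# assumptions; match them to your link tool's real export format.
import csv

RISK_KEYWORDS = ("article", "directory", "links", "seo")

with open("referring_domains.csv", newline="") as f:
    flagged = [
        row["domain"]
        for row in csv.DictReader(f)
        if any(kw in row["domain"].lower() for kw in RISK_KEYWORDS)
    ]

print(f"{len(flagged)} domains worth a manual look:")
for domain in flagged:
    print(" -", domain)
```

It won't replace eyeballing the full list, but it surfaces the obvious candidates fast.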
-
So, per the link metrics from OSE: the site I am auditing has 69 referring domains with 1,199 links, a couple hundred of which are directories. There do not seem to be any spammy referring domains for either site after a quick once-through. The competitor has 10 referring domains with 77 links, and the average DA of the competitor's referring domains is about half that of the site I am auditing. The competitor's anchor text is, on average, slightly better for the keywords in question. All in all, though, the link portfolios are not what is beating the site I am auditing.
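For anyone who wants to reproduce that kind of side-by-side comparison from a raw backlink export, here is a rough sketch. The column names are assumptions based on a generic CSV layout, not the actual OSE export format:

```python
# Sketch: summarize referring domains, link counts, and average DA per
# target site from a backlink CSV. The column names ("target_site",
# "source_url", "domain_authority") are assumptions, not a real OSE schema.
import csv
from collections import defaultdict
from statistics import mean
from urllib.parse import urlparse

stats = defaultdict(lambda: {"links": 0, "domains": set(), "das": []})

with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        site = stats[row["target_site"]]
        site["links"] += 1
        site["domains"].add(urlparse(row["source_url"]).netloc)
        site["das"].append(float(row["domain_authority"]))

for name, s in stats.items():
    print(f"{name}: {len(s['domains'])} referring domains, "
          f"{s['links']} total links, avg referring DA {mean(s['das']):.1f}")
```

That makes it easy to see at a glance whether link volume or authority is the differentiator, which in this case it doesn't appear to be.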
-
That makes sense.
-
No, it's a totally regional industry; the two companies aren't competitors, and the marketing company has exclusivity in its contracts, so it can't work with competitors inside a certain radius.
I didn't mean they should be ranking nationally; I am just saying it is possible, with regard to your question of whether local or national SEO is more important.
-
What? That is a little crazy. I don't think I could work for two companies trying to rank for the same keywords; that is such a conflict of interest.
Every site is unique, and there are over 200 ranking factors, so it isn't really fair to say the two should have the same results. The sites probably differ enough to make ranking each one a challenge, especially on the same key terms.
-
Yes, they are a local service company serving St. Louis. However, I will say that the marketing company they hired has a client in the same field in New England that ranks in the top 5 nationally for the same keywords, so to me there is no reason they shouldn't be able to do the same.
-
I totally agree that it needs to be rewritten. Is local SEO more important than ranking nationally?
-
Yeah, you are totally right; I have to dig into the backlinks. I will post the results back here when that's done.
The results in question are local results, which is why the site with the original content doesn't rank but the duplicate does: the original content belongs to a company half the US away. Neither company ranks for the search terms on a national scale, but when I paste the content directly into Google and search, the original content does beat out the site I am auditing.
-
I think you are right in your assumption; duplicate content is never a good thing. However, if the site that is outranking them doesn't have the same content, then Google must be seeing the site you are auditing as more authoritative than the site they copied the content from. So, while the duplication is an issue, the links may show you where the real optimization work needs to happen. If things are neck and neck, as I understand it, the link profile is going to be extremely important.
The content, no doubt, should be rewritten. Without a look at the link profile, though, you can't say the duplication is the reason they aren't outranking the site in the number one spot.