Content - Similar but not exactly the same content - Duplicate or Spammy?
-
Hey, so I have been wondering for some time now why some pages will get indexed and others won't appear at all. That makes me think that I am either creating content that is too similar or content that is becoming too spammy.
Take these two pages I created, for example. The body content is very similar, but the h tags, meta tags, and titles are different. So my question is: could pages fail to display because they are too similar, too spammy, or duplicates?
I have linked two pages that are very similar below and would love to hear any thoughts about it.
Any feedback would be greatly appreciated. Thanks in advance.
-
Lots of people grab content and republish it.
Lots of people grab the same content and republish it.
The first few people who do it have the best chance of getting away with it. But if you are the tenth or the twentieth, then you are more likely to be ignored by Google. (After you republish this duplicate, Google might find it, index it, and rank it... then some months down the road they realize that your stuff is duplicate and take action against it.)
The exception to the above is when you are a powerful publisher. Then you can get away with a lot more than other republishers, and you might even outrank the original source.
"If you've realized that your local industry is riddled with poor quality content, see this as your opportunity to beat out lazier competitors. If you deliver the superior experience, it may give you a very valuable edge, while also safeguarding your reputation and rankings against Google filters and penalties in future."
This is so true and so surprising. There are still a lot of topics on the internet that are not covered by substantive, high quality content written by an authoritative author.
-
Thanks for coming back with further questions on this. Unfortunately, changing tags and maps doesn't make your text content "different enough" to lessen the concern that Google may view what you are doing as duplicative.
Basically, your business model is a local business with a single location. If you wish to gain organic visibility for your services beyond that single location, it's true that developing landing pages for these other locations is a best practice ... only provided that the content of them is actually useful and largely unique. You are in a similar scenario to a plumber, who has a single location from which he offers a set of services to a variety of neighboring towns. His Local SEO is going to be anchored to his city of location, but his organic SEO can branch out to represent his work in the other towns he serves. There is nothing spammy about him featuring this work in his service cities, but unless he has something unique to say about his work there, he's going to end up with a weak site burdened with duplicate content clearly designed for search engines instead of for the assistance of consumers.
I recommend taking a look at a blog post I wrote here a couple of years ago that offers tips for creating strong, diversified landing pages in a scenario much like yours:
https://moz.com/blog/overcoming-your-fear-of-local-landing-pages
You will need to dig deep into your resources to create this type of useful, unique content.
As for your competitors, your question is reasonable. If Google doesn't like thin, duplicate content, why do we see people getting away with it? To this, my answer is:
-
People were getting away with all kinds of things the day before an update like Penguin or Panda. They woke up the following day to a changed world in which their lack of effort was no longer being rewarded.
-
If you've realized that your local industry is riddled with poor quality content, see this as your opportunity to beat out lazier competitors. If you deliver the superior experience, it may give you a very valuable edge, while also safeguarding your reputation and rankings against Google filters and penalties in future.
Hope this helps!
-
-
Thank you for the feedback.
I see all the points mentioned, but I still feel confused about it. I have worked in the industry for 5 years now, and some of the websites that do this have been ranking well for at least that long. Wouldn't Google have penalised these websites by now? I started to do it too and, sure enough, I also started to rank better and get more inquiries.
I want to make one point here. Even though the content is similar, it is not the same, and I have clearly changed factors that will help Google understand what I am trying to achieve: meta tags, titles, h tags, even the maps on the pages. Location-to-location targeting is what I am trying to factor in with this project. Even though the content is very similar, is it still not unique? Where is the line? That is what I find confusing. Do I spend more time making pages unique even though I may not need to? Will that be better in the long run and keep the pages positioned better for longer? How unique does the content have to be?
-
Seconding the opinions of Clive and EGOL here, and I particularly want to highlight EGOL's point about not imitating competitors' poor practices. It might help to view these competitors as being just one Google action away from getting dinged for this strategy.
It seems like the challenge for you here is to create something that helps customers understand the geography of your services, without simply duplicating the same page and swapping out city names. I would recommend putting some creative resources behind figuring out how to meet this challenge.
-
Clive Morley is right. 100% right.
These pages are close enough to being duplicates and Google will likely filter one of them from the search results. Maybe they will filter both of them from the search results.
If you want to compete for slightly different keywords then you will need to produce unique and substantive content for every one of them.
"Many competitors that rank much better do exactly the same."
This is sometimes true. But Google has detected that you are doing it. Someday Google might detect that they are doing it.
So, now it is up to you to stop taking shortcuts and do the work required to present unique value for each page on your website.
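As a rough, do-it-yourself sanity check, you can compare the visible text of two pages using word "shingles" (overlapping k-word phrases) and their Jaccard overlap. This is only an illustrative heuristic, not how Google's duplicate detection actually works, and the 0.33 threshold people sometimes cite is folklore, not a published figure. The sample sentences below are hypothetical:

```python
import re

def shingles(text, k=5):
    """Lowercase the text, split into words, and return the set of
    overlapping k-word sequences ("shingles")."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(text_a, text_b, k=5):
    """Jaccard similarity of two pages' shingle sets: 0.0 means no
    shared k-word phrases, 1.0 means identical phrasing throughout."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Two hypothetical city pages that differ only in the destination name:
page_one = "We help you move from Gold Coast to Brisbane quickly and safely."
page_two = "We help you move from Gold Coast to Sydney quickly and safely."

print(f"{similarity(page_one, page_two):.2f}")  # → 0.33
```

Swapping a single city name still leaves a third of the phrasing shared even on a one-sentence example; on full pages the overlap would be far higher, which is the pattern being described above.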
-
Hello,
These pages are virtually identical and I wouldn't be surprised if Google views these as duplicate or spammy.
Both of these pages serve the same user intent - moving from Gold Coast to another location - and it's therefore worth considering having just one page that covers all of the numerous destinations.
Both pages have an unnatural amount of keyword usage, which may also trigger spam filters.
If the purpose is to rank for "moving from Gold Coast to..." type searches, then this page would probably benefit from having more Gold Coast-related content: geographical info, images, etc.
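If it helps to put a number on "an unnatural amount of keyword usage" before publishing, a quick sketch is below. The threshold for "unnatural" is a judgment call (Google publishes no figure), and the sample sentence is hypothetical:

```python
import re

def keyword_density(text, phrase):
    """Fraction of the words in `text` accounted for by occurrences
    of the (possibly multi-word) `phrase`, case-insensitive."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words) if words else 0.0

# A deliberately stuffed example sentence:
density = keyword_density(
    "Gold Coast removals by the Gold Coast removals experts in Gold Coast",
    "Gold Coast",
)
print(f"{density:.0%}")  # → 50%
```

When a single phrase accounts for a large share of a page's words, as in this exaggerated example, the text tends to read as written for search engines rather than for people.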
Good luck
Related Questions
-
Are press releases that end up being published as duplicate content, with links pointing back to you, bad for your site?
With all the changes to the SEO landscape in recent years, I'm a little unsure how a press release looks in the eyes of Google (and others). For instance, you write up a 500-word press release and it gets featured on the following sites: Forbes, TechCrunch, BBC, CNN, NY Times, etc. If each of these covers your story but only rewrites 50% of the article (not saying these sites wouldn't rewrite the entire article, but for this purpose let's presume only 50% is rewritten), could it be negative to your backlink profile? I'm thinking not, as these sites will have high authority, but what if, once your press release is published on these sites, 10 other smaller sites republish the stories with almost no rewriting, either straight from the press release or straight from the article on the mainstream news sites? (For clarification, this press release would be done in the fashion of an article suggestion to relevant journalists, rather than a blanket press release via PR Newswire, mass mail-out, etc. Although I guess the effect with duplicate content backlinks is the same.) You now have c. 50 articles online, all with very similar content, with links pointing back at you. Would this have a negative effect, or would each link just not carry as much value as it normally would? By now we all understand publishing duplicate content on our own sites is a terrible idea, but does having links pointing back to yourself from duplicate (or similar) content hosted on other sites (some being highly authoritative) affect your site's SEO?
Content Development | | Sam-P1 -
How can we dynamically populate content on our website based on a visitor's web history?
Recently, I have tried looking into options that would allow us to dynamically populate content (specifically images to be used as CTAs on our blog in WordPress) for different users based on their web history on our website. We would want to be able to dynamically populate images based on the number of visits in the past 60 days and the industry inferred from the pages hit. Unfortunately, I have been unable to find anything as a standalone tool. I believe HubSpot may have something like this, but it is rolled into their blogging platform. Does anyone know of one?
Content Development | | SMPoulton0 -
Duplicate Content Discovery
I was hit by Penguin on April 24th like a ton of bricks. Luckily my cash-cow keyword was kept safe, and still is today, with even an increase in traffic over the year. With some other main keywords I used to rank for, I fell off the board on that day. Since then I have been slowly trying to clean things up as much as I know how. Today I was sitting down with my coffee and my Penguin mindset, and I decided to use Copyscape again to review duplicate content issues. Something I noticed, which I either didn't notice before or didn't think was an issue, was my footer. In my footer I used a blurb from some other site in my niche a long time ago, which I discovered they used from one of the main sites in my niche. Anyway, I noticed that my footer is what kept coming up as duplicate content, always at an average of 28% according to Copyscape. My question is: should I be worried about the footer? Is 28% a lot?
Content Development | | cbielich0 -
Marking our content as original, where the rel=author tag might not be applied
Hello, Can anyone tell me if it is possible to protect text-type content without the rel=author tag? We host a business listing site where, apart from the general contact information, we have also started to write original, 800+ character-long, unique contents for the suppliers where we expect visits, so rankings should increase. My issue is that this is a very competitive business, and content crawling is really an everyday practice. Of course, I would like to keep my original content, or at least mark it as mine for Google. The easiest way would be the author tag, but the problem is that I do not want our names and our photos to be assigned to these contents, because on the one hand, we are not acknowledged content providers in our own right (no bio or anything like that), and on the other hand, we provide contents for every sort of business, so additional links to our other contents might not help readers get what they want. I also really do not think that a photo of me could help increase the CTR from the SERP. :) What we currently do is submit every major fresh content through URL submission in WMT, hoping that first indexing might help. We have only a handful of them within a day, so not more than 10. Yes, I could perhaps use absolute links, but this is not a feasible scenario in all cases, and as for DMCA, as our programmer says, what you can see on the internet, you can basically own. So finally, I do not mind our contents being stolen, as I can't possibly prevent this. I want, however, our original content to be recognized as ours by Google, even after the stealing is done. (Best would be an 'author tag for business', connected to our business Google+ page, but I am not aware that this function can be used this way.) Thank you in advance to all of you sharing your thoughts with me on the topic.
Content Development | | Dilbak0 -
Duplicate page title on blog
Is this something I should be concerned about? I get about 6 posts per page with unique titles; however, the title per page is not unique. Is this important?
Content Development | | MartinSpence460 -
Duplicate content on forums?
I am creating a forum. I am concerned that when I create the forum, users will copy content from other places to post onto it. How negative is this in Google's eyes? I am concerned about people copying press releases and re-posting them to the forum. Should I make a rule that all content must be typed and not copied? Or is a little copying okay?
Content Development | | sseibel0 -
Blog content practices for e-commerce sites
What is the best practice in regards to content for e-commerce blogs on the same domain as the web-store (blog.storename.com)? What balance of content should be on the blog vs. the item & section pages or doesn't it matter?
Content Development | | MEldridge0 -
How can I use my unique content to my advantage?
Hi, I run http://ablemagazine.co.uk - we also put out a print magazine every 2 months (the biggest disability magazine in Britain). This means we have loads of unique content (around 30 feature stories and 30 news stories every 2 months). Just wondering how I can use this to my advantage? I've been social bookmarking the feature stories (Reddit, etc.) and posting a link to all my unique stuff on Facebook/Twitter. Just wondering if there's anything else I should be doing? Thanks
Content Development | | craven220