Will I have duplicate content on my own website?
-
Hello Moz community,

We are an agency providing services to various industries, among them the hair salon industry. On our website, we have our different service pages in the main menu, as usual. These service pages contain general information and apply to any industry. We also have a page on the website that is intended only for the hair salon industry. On this page, we would like to link to new service pages: they will cover the same services as our "general" services, but specialized for hair salons. My questions relate to duplicate content:
- Do we have to make the new individual service pages for hair salons with completely different text, even though it’s the same service, in order to avoid having duplicate content?
- Can we just change a few words from the “general service” page to specifically target hair salons, and somehow avoid Google seeing it as duplicate content?
Reminder: these pages will be linked internally from the hair salon industry page. Thank you in advance for your answers, Gaël
-
Gael,
The rule is pretty simple here: there's really no such thing as a "duplicate content penalty" that will hurt your entire site. The real question is: do you want those pages with content similar to the original to rank as well as the original page? If so, each page will have to be genuinely unique and build its own expertise, authority, and trust (quality content, inbound links, local SEO).
While there may not be a penalty for content duplicated on your site, Google does have a threshold for what it considers original content, and just changing a few words on the page will not meet it. You really should start from scratch and rethink the copy on each page if you want these pages to rank well in the future.
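If some hair-salon pages must go live before unique copy is ready, one common interim option (not mentioned in the answer above, so treat this as an editorial sketch) is pointing a rel=canonical from each salon-specific page at its general counterpart. This consolidates ranking signals on the general page, at the cost of the salon pages not ranking on their own until the canonical is removed:

```html
<!-- Hypothetical sketch: placed in the <head> of the hair-salon
     version of a service page, pointing at the general version.
     The URL is illustrative, not the asker's real path. -->
<link rel="canonical" href="https://example.com/services/color-treatment/" />
```

Once each salon page has genuinely unique copy, the canonical would be switched back to self-referencing so the page can rank in its own right.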
Related Questions
-
Duplicate content from page links
So for the last month or so I have been going through fixing SEO content issues on our site. One of the biggest issues has been duplicate content with WHMCS. Some pages have been easy to fix and others have been a nightmare. Some of the duplicate content comes from the login page shown when a page requires a login: for example, knowledge base articles that are only viewable by clients. That was easy for me to fix, as I don't really need them locked down like that. However, pages like affiliate.php and pwreset.php are only linked from a single page, and I am unsure how to handle these. Here are some pages that are being listed as duplicate:

https://www.bluerayconcepts.com/brcl...art.php?a=view
https://www.bluerayconcepts.com/brcl...php?a=checkout

Should this type of page be a 301 redirect to cart.php, or would that break something? I am guessing that everything should point back to cart.php. These are the ones that are really weird to me: they show as duplicate content, but pwreset.php is only linked from the KB category. It shows up as duplicate many times, as does affiliate.php:

https://www.bluerayconcepts.com/brcl...ebase/16/Email
https://www.bluerayconcepts.com/brcl...16/pwreset.php

Any help is very welcome.
On-Page Optimization | blueray
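For utility pages like pwreset.php and affiliate.php, one option is simply to keep them out of the crawl with robots.txt rather than redirecting them, since a 301 to cart.php would break the pages for users who genuinely need a password reset. A sketch, assuming the WHMCS install lives under /brcl/ (the path in the URLs above) and that the exact filenames match the install:

```text
# Hypothetical robots.txt sketch -- paths assume WHMCS under /brcl/
User-agent: *
Disallow: /brcl/pwreset.php
Disallow: /brcl/affiliate.php
```

Note that robots.txt blocks crawling, not indexing; if these URLs are already indexed, a noindex meta tag or X-Robots-Tag header is the more reliable way to get them removed.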
Can I robots.txt an entire site to get rid of Duplicate content?
I am in the process of implementing Zendesk and will have two separate Zendesk sites with the same content to serve two separate user groups for the same product (B2B and B2C). Zendesk does not give me the option to change canonicals (nor meta tags). If I robots.txt one of the Zendesk sites, will that cover me for duplicate content with Google? Is that a good option, or is there a better one? I will also have to change some of the canonicals on my site (mysite.com) to use the Zendesk canonicals (zendesk.mysite.com) to avoid duplicate content. Will I lose ranking by changing the established page canonicals on my site to the new subdomain (the only option offered through Zendesk)? Thank you.
On-Page Optimization | RoxBrock
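If canonicals and meta tags can't be edited, blocking one of the two duplicate help centers from crawling is done with a robots.txt served from that subdomain's root. A sketch, assuming the platform allows a custom robots.txt on the duplicate host:

```text
# Hypothetical robots.txt at the root of the duplicate help-center
# subdomain (e.g. help-b2c.mysite.com/robots.txt)
User-agent: *
Disallow: /
```

The caveat: disallowing crawl does not guarantee de-indexing, and URLs blocked this way can still appear in results (without snippets) if they are linked from elsewhere. It prevents most duplicate-content overlap, but it is a blunter tool than a canonical.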
Duplicate content - Opencart
In my last report I have a lot of duplicate content. The duplicate pages are:

http://mysite.com/product/search&filter_tag=Сваров�%
http://mysite.com/product/search&filter_tag=бижу
http://mysite.com/product/search&filter_tag=бижузо�%8

And many more, all starting with http://mysite.com/product/search&filter_tag= Any ideas? Maybe I should do something in robots.txt; if so, please tell me the exact code. Best regards, Emil
On-Page Optimization | famozni
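Since all the duplicate URLs share one prefix, and robots.txt rules are prefix matches, a single Disallow line would cover them. A sketch, assuming robots.txt sits at the site root and the report's URL pattern is complete:

```text
# Hypothetical robots.txt sketch: block Opencart tag-search pages.
# Assumes every duplicate URL starts with the prefix below.
User-agent: *
Disallow: /product/search&filter_tag=
```

As with any robots.txt fix, this stops future crawling but does not remove URLs already in the index; pairing it with a canonical on the search template (if the theme allows it) is the more thorough option.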
Duplicate content when using "visibility classes" in responsive design layouts - an SEO problem?
I have text in the right column of my responsive layout which shows up below the principal content on small devices. To do this, I use visibility classes on DIVs. So I have a DIV with unique style text that is visible only at large screen sizes. I copied the same text into another DIV which shows up only on small devices, while the first DIV is hidden. Technically, I have the same text twice on my page. Might this be detected as duplicate content, or even as spam? I'm concerned because hidden text (as in expand/collapse text blocks) is still read by bots, and in my case they will see it twice. Does anybody have experience with this issue?

Best, Holger
On-Page Optimization | inlinear
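An alternative that avoids the question entirely is to keep one copy of the text and let CSS reposition it: below the main content on small screens, in the right column on large ones. A minimal sketch, with class names invented for illustration:

```html
<!-- One copy of the text; a media query decides where it renders -->
<style>
  @media (min-width: 768px) {
    .layout { display: flex; }
    .main   { flex: 3; }
    .aside  { flex: 1; }  /* becomes the right column on wide screens */
  }
</style>
<div class="layout">
  <div class="main">Principal content…</div>
  <div class="aside">Unique style text, present only once in the DOM.</div>
</div>
```

With block layout on small screens the aside simply stacks after the main content, so no hidden duplicate is needed at any breakpoint.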
Duplicate Content Issues with Forum
Hi Everyone, I just signed up last night and received the crawl stats for my site (ShapeFit.com). Since April of 2011, my site has been severely impacted by Google's Panda and Penguin algorithm updates and we have lost about 80% of our traffic during that time. I have been trying to follow the guidelines provided by Google to fix the issues and help recover, but nothing seems to be working.

The majority of my time has been invested in trying to add content to "thin" pages on the site and filing DMCA notices for copyright infringement issues. Since this work has not produced any noticeable recovery, I decided to focus my attention on removing bad backlinks, and this is how I found SEOmoz.

My question is about duplicate content. The crawl diagnostics showed 6,000 errors for duplicate page content and the same for duplicate page title. After reviewing the details, it looks like almost every page is from the forum (shapefit.com/forum). What's the best way to resolve these issues? Should I completely block the "forum" folder from being indexed by Google, or is there something I can do within the forum software to fix this (I use phpBB)? I really appreciate any feedback that would help fix these issues so the site can hopefully start recovering from Panda/Penguin. Thank you, Kris
On-Page Optimization | shapefit
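If the decision is to take the forum out of the crawl entirely, the robots.txt version is a one-liner. A sketch, assuming the forum lives at /forum/ as the post suggests:

```text
# Hypothetical robots.txt sketch -- assumes the phpBB forum is at /forum/
User-agent: *
Disallow: /forum/
```

The trade-off is that blocked threads can no longer earn rankings or pass internal link value, so the in-software route (canonical URLs and noindex on the duplicate views such as print pages and session-parameter URLs) is worth investigating before blocking everything.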
Silo and content
I'm about to launch my site but I have a question regarding content and silo structure. If I don't have enough content to fill four subpages, would it be better to have only a single landing page with keyword-rich content for a silo, instead of multiple pages with thin content? Thank you!
On-Page Optimization | mediodigital
Duplicate content http:// something .com and http:// something .com/
Hi, I've just got a crawl report for a new WordPress blog with the Suffusion theme and the Yoast WordPress SEO module, and there is duplicate content for http:// something .com and http:// something .com/ I just can't figure out how to handle this. Can I add a redirect from .com/ to .com in .htaccess? Any help is appreciated! By the way, the rel canonical tag value is **http:// something .com/** for both.
On-Page Optimization | DanielSndstrm
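For the bare domain, http://example.com and http://example.com/ are the same URL (clients always request the path /), so that particular pair is usually a crawl-report quirk rather than real duplication, and the self-referencing canonical with the slash is fine. For deeper paths, an .htaccess rule can enforce one version consistently. A sketch, assuming Apache with mod_rewrite enabled, that redirects to the trailing-slash form (matching the canonical reported above) while leaving real files like images and CSS alone:

```apache
# Hypothetical .htaccess sketch: 301-redirect non-slash URLs to the
# trailing-slash version, skipping requests for actual files.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```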
Number of characters to duplicate content
I wonder how many characters of a page title need to match before Google treats it as duplicate content?

Sorry for the English, I used Google Translate. I'm from Brazil 😄 Thanks.
On-Page Optimization | imoveiscamposdojordao