Duplicate content
-
I have two sentences that I want to optimize two different pages for.
Sentence number one is:
travel to ibiza by boat
Sentence number two is:
travel to ibiza by ferry
My question is: can I have the same content on both pages except for the keywords, or will Google treat that as duplicate content and punish me? And if so, where is the limit for how much duplicate content is too much?
-
Yes, I agree with this!
-
You cannot take the same copy and reuse it for another topic just by swapping out a few words.
-
Yes, I agree with ameliavargo: you could make just a single page that includes both keywords and potentially rank for both phrases. Boat and ferry are really similar, and I am not sure you can make two different pages distinct enough to rank both well. That could actually backfire.
I would definitely opt for a single optimized page that includes both words.
-
I may be missing the point here, but why do you need two pages at all? Google understands synonyms, so as boat and ferry are essentially the same thing, you could probably get away with writing just one... especially as the message would be the same for both topics.
If you feel you need two pages, then definitely make them unique!
As for URLs: if you can, make sure your keyword is in the URL (rough example of building a slug below).
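For example, a quick way to turn a keyword phrase into a URL slug (just a sketch; any slug logic will do):

```python
import re

def slugify(phrase: str) -> str:
    """Turn a keyword phrase into a URL-friendly slug (rough sketch only)."""
    slug = phrase.lower().strip()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # non-alphanumerics become hyphens
    return slug.strip("-")

print(slugify("travel to ibiza by ferry"))  # -> travel-to-ibiza-by-ferry
```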
Hope this helps
-
I don't know how much control you have over your URL structure, but you could always create:
- domain.com/toibiza/ --> tips on how to travel to Ibiza
- domain.com/toibiza/ferry/ --> content focused on the advantages of going to Ibiza by ferry (and a link to book now, if you have one)
- domain.com/toibiza/boat/ --> content focused on the advantages of going to Ibiza by boat (and a link to book now, if you have one)
You can link to both the boat and ferry pages from the Ibiza page, and link them to each other as alternatives (a rough sketch of this structure is below).
Creating three different texts of 200 words each won't be a high cost with a freelancer, though you can always try to mash up some content from elsewhere.
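To make the linking idea concrete, here is a rough sketch of that structure (the paths and titles are placeholders, not pulled from any real site):

```python
# Rough sketch of the suggested structure: one hub page about getting to
# Ibiza plus one focused sub-page per transport option.
pages = {
    "/toibiza/": "Tips on how to travel to Ibiza",
    "/toibiza/ferry/": "Advantages of going to Ibiza by ferry",
    "/toibiza/boat/": "Advantages of going to Ibiza by boat",
}

hub = "/toibiza/"
subpages = [path for path in pages if path != hub]

# The hub links down to each focused sub-page...
for path in subpages:
    print(f'{hub} -> <a href="{path}">{pages[path]}</a>')

# ...and the sub-pages link to each other as alternatives.
for path in subpages:
    for other in subpages:
        if other != path:
            print(f'{path} -> <a href="{other}">{pages[other]}</a>')
```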
-
Why not use www.domain.com/your-key-word.html?
-
Is there any difference, or are you actually on the same thing once you hop on a boat/ferry?
-
Thank you, but can the URLs be similar?
Like this: www.domain.com/ferrytoibiza and www.domain.com/boattoibizza
And then put unique content on both pages?
-
Definitely invest in two unique pieces of copy. If there are thousands of pages and you are unable to get them all done at once, you may try variable-driven text; however, replacing one element and expecting Google to like it is asking a bit too much (a rough sketch of why is below).
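As an illustration of how little a one-word swap actually changes (just a sketch; Google does not publish any similarity threshold), here is what variable-driven text looks like when only the transport word varies:

```python
import difflib

# Hypothetical variable-driven template: identical copy with one slot swapped.
template = (
    "Travel to Ibiza by {mode}. The {mode} leaves the harbour every morning, "
    "the crossing takes around three hours, and you can book your {mode} "
    "ticket online."
)

boat_page = template.format(mode="boat")
ferry_page = template.format(mode="ferry")

# A crude similarity ratio between the two texts. Google measures nothing
# like this exactly; it just shows how little actually changed.
ratio = difflib.SequenceMatcher(None, boat_page, ferry_page).ratio()
print(f"Similarity: {ratio:.0%}")  # well above 90%
```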