Landing page video scripts - duplicate content concerns
-
We are planning to create a series of short (<30-second) videos for landing pages in our clients' PPC campaigns.
Since our clients all offer the same services (just in different geographical regions of the country), we were planning to use the SAME script (approx. 85 words) with only each client's business name changed.
Our question is: would these videos be identified as 'duplicate content' if we only use them on PPC landing pages? In other words, are we in any danger of consequences from the search engines for repeating script text across a series of landing pages used only in PPC campaigns?
-
Thanks, Steve!
In this scenario I would do the following:
- Add a noindex tag to all of these PPC landing pages (this ensures the pages aren't indexed).
- Make sure the URLs aren't in any sitemaps (good practice).
- Once you've confirmed they aren't indexed, add a disallow rule in robots.txt (this preserves crawl budget).
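The steps above could look roughly like this; treat it as a sketch, where `/ppc-landing/` is a hypothetical URL path for the PPC pages:

```
<!-- Step 1: on each PPC landing page, inside <head> -->
<meta name="robots" content="noindex">

# Step 3: robots.txt — add only AFTER confirming the pages are deindexed.
# If you disallow first, the crawler can never fetch the pages to see
# the noindex tag, and they may stay in the index.
User-agent: *
Disallow: /ppc-landing/
```

The ordering matters for the reason given in the steps: the noindex tag only works while the page is still crawlable, so the robots.txt rule comes last.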
Hope that helps and all the best.
-
#1 - only a few (3-4) of each landing page, for PPC purposes, and we intend to add noindex code to indicate we do not want them indexed.
#2 - it's unlikely we will ever want to expand the content; if we do, we would rewrite it to ensure each page had unique content.
-
Hey Steve,
To answer your question directly about whether Google will see these pages as duplicate content: if you use the same text on them all (for the most part) and don't offer anything else unique, then most likely yes.
I think a more important question to ask is whether Google will see these pages as too thin (which could have some pretty negative consequences).
Can I ask a few questions to get some more information?
- How many of these pages are you planning to create for your client?
- Do you feel like there could potentially be some SEO benefits to having these pages in the index if there was an effort to improve them? In other words, is there search volume/opportunity around the topics the videos are about from an organic perspective?
-
As far as I know, the duplicate content problem is mainly about spammers with hundreds of TLDs trying to trick the search engines; in your case, with 5 or 10 pages or videos, there shouldn't be any problem.
https://www.youtube.com/watch?v=Ets7nHOV1Yo
As long as you declare the content and location with video markup, you should not have any problem.
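The video markup mentioned above is typically done with schema.org VideoObject in JSON-LD on each landing page. A minimal sketch follows; all names, dates, and URLs are placeholders, not real values:

```html
<!-- Example VideoObject markup (placeholder values throughout) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Acme Plumbing - Springfield",
  "description": "A 30-second introduction to our services in Springfield.",
  "thumbnailUrl": "https://example.com/video-thumb.jpg",
  "uploadDate": "2024-01-01",
  "duration": "PT30S",
  "contentUrl": "https://example.com/videos/intro.mp4"
}
</script>
```

Since the script is the same across clients, only `name`, `description`, and the URLs would vary per page.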