If I embed the same video from my YouTube account on two different websites, will I get a duplicate content penalty?
-
I have a YouTube video I want to show my B2B and B2C customers, but I have a different website for each. If I embed the video on both, will I get a duplicate content strike against me?
-
Thank you for the responses! Greatly appreciated. I'm just wondering if someday Google will see this as duplicate content... as it does written content.
Why is it that news sources can syndicate content, but if I put that content on my blog, it's duplicate content?
-
Matt's suggestion of adding unique, valuable and interesting contextual info around the video is a very good approach.
-
The best thing to do would be to add contextual information around the video. On the B2B site, your introduction or summary of the video should target that demographic. And the same for the B2C.
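As a rough illustration of that advice, here is a minimal HTML sketch of the same embed wrapped in audience-specific copy; the headings, text, and VIDEO_ID are hypothetical placeholders, and the B2C page would reuse the same embed with its own introduction and summary.

```html
<!-- B2B page: the same YouTube embed, but the surrounding copy targets business buyers -->
<section>
  <h2>Product walkthrough for procurement teams</h2>
  <p>A short introduction written specifically for the B2B audience, explaining
     which parts of the demo matter most to them.</p>
  <!-- VIDEO_ID is a placeholder for the actual YouTube video ID -->
  <iframe src="https://www.youtube.com/embed/VIDEO_ID"
          title="Product walkthrough"
          width="560" height="315"
          allowfullscreen></iframe>
  <p>A brief summary or key takeaways, again phrased for this demographic.</p>
</section>
```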
-
Think of it as a "viral campaign". It's not your "fault" that your video is embedded on more than one website.
On the other hand, at this time, search engines cannot read the "information" in a video, so in any case there can be no duplicate content.
-
No, you won't get penalized for duplicate content, because what you are doing is, in effect, syndicating content. YouTube owns the original content and will get credit for it. This is a similar scenario to, say, a news story from the Associated Press that gets picked up and published in newspapers across the country. They are syndicating, not duplicating, the content. However, the credit for that content creation is retained by the original source.
Hope that helps!
Dana
Related Questions
-
How to fix duplicate content for homepage and index.html
Hello, I know this probably gets asked quite a lot, but I haven't found a recent post about this in 2018 on Moz Q&A, so I thought I would check in and see what the best route/solution for this issue might be. I'm always really worried about making any (potentially bad/wrong) changes to the site, as it's my livelihood, so I'm hoping someone can point me in the right direction. Moz, SEMRush and several other SEO tools are all reporting that I have duplicate content for my homepage and index.html (the same identical page). According to Moz, my homepage (without index.html) has PA 29 and index.html has PA 15. They are both showing status 200. I read that you can either do a 301 redirect or add rel=canonical. I currently have a 301 set up for my http to https pages and don't have any rel=canonical added to the site/page. What is the best and safest way to get rid of the duplicate content and merge my non-index.html and index.html homepages these days? I read that both a 301 and a canonical pass on link juice, but I don't know what the best route for me is given what I said above. Thank you for reading; any input is greatly appreciated!
On-Page Optimization | dreservices
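For illustration only (this is not an answer from the original thread): the rel=canonical option mentioned above would look roughly like the sketch below, placed in the head of index.html and pointing at the bare homepage URL, with example.com as a placeholder domain. The 301 option would instead be a server-side redirect from /index.html to /.

```html
<!-- In the <head> of /index.html: declare the bare homepage URL as the preferred version -->
<link rel="canonical" href="https://www.example.com/" />
```
-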
Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (here I mean changing the publish date from the original publish date to today's date, not publishing on other sites). I'm wondering if there is any danger in doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community and organic traffic visitors. It seems like we could add a lot of value for readers by doing this, but I'm a little worried it might somehow be seen by Google as manipulative, spammy, or something that could otherwise get us in trouble. Does anyone have experience doing this, or thoughts on whether it might somehow be dangerous? Thanks, Moz community!
On-Page Optimization | paulz999
-
Duplicate Content - Bulk analysis tool?
Hi, I wondered if there's a tool to analyse duplicate content, either within your own site or on external sites, where you can upload the URLs you want to check in bulk? I used Copyscape a while ago, but I don't remember it having a bulk feature. Thank you!
On-Page Optimization | BeckyKey
-
Is the HTML content inside an image slideshow of a website crawled by Google?
I am building a website for a client and I am in a dilemma over whether to go for an image slideshow with HTML content on the slides or a static full-size image on the homepage. My concern is that the HTML content in the slideshow may not get crawled by Google and hence may not be SEO-friendly.
On-Page Optimization | aravinn
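For reference, a minimal sketch of the distinction the question is getting at: if the slide text exists as real HTML in the page source (as below, with placeholder file names and copy), crawlers can see it even while a script shows and hides the slides, whereas text baked into the image file itself generally cannot be read.

```html
<!-- Slide copy kept as real HTML rather than rendered into the image -->
<div class="slideshow">
  <div class="slide">
    <img src="/images/slide-1.jpg" alt="Short description of the slide image">
    <h2>Slide heading that exists in the HTML source</h2>
    <p>Supporting copy that remains in the markup even while hidden by the slideshow script.</p>
  </div>
  <!-- additional slides follow the same pattern -->
</div>
```
-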
Should I redirect mobile traffic to a different url? Will it hurt SEO?
I'm working on a site that has lots of great content and ranks well, but essentially the money is generated by affiliate links. I don't have a mobile version of the site, but the company I'm affiliated with does offer a mobile redirect to their domain. Will redirecting mobile traffic to a different URL hurt my SEO? I think the user will get a better experience by landing on a mobile page, but I don't know if Google will see it like that. Any thoughts?
On-Page Optimization | SamCUK
-
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and alongside it I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites and fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh
-
Will Google penalize my website if I hide the H1 tag?
If I hide the H1 tag (the title on the homepage) with CSS, how will Google handle my site?
On-Page Optimization | joeko
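For context, a minimal sketch of what hiding an H1 with CSS typically means; the class name and heading text are hypothetical placeholders.

```html
<style>
  /* The heading stays in the HTML source but is removed from the rendered page */
  .hidden-h1 { display: none; }
</style>
<h1 class="hidden-h1">Homepage title heading</h1>
```
-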
Does schema.org assist with duplicate content concerns
The issue of duplicate content has been well documented, and there are lots of articles suggesting to noindex archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply, is: is no-indexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, which can be marked up as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, therefore removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances; I'm just interested in whether or not the search engines can handle this appropriately.
On-Page Optimization | MarkCA
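As a sketch of the kind of mark-up the question describes, one entry in a category or archive listing might carry JSON-LD like the following, with the headline, URL, and date as placeholders; whether this removes the need to noindex archive pages is exactly the open question being asked.

```html
<!-- One item in an archive listing, with the canonical article URL made explicit -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Example post title",
  "url": "https://www.example.com/blog/example-post/",
  "datePublished": "2014-01-15"
}
</script>
```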