Duplicate Content on Press Release?
-
Hi,
We recently held a charity night in store, and a few local celebrities turned up.
We created a press release to send out to various media outlets; within the press release were hyperlinks to our site, with certain keywords linking to specific brand pages on our site.
My question is: should we send a different press release to each outlet to avoid duplicate content issues, or is sending the same release to everyone OK?
We will be sending approximately 20 of these out, some going online and some not.
So far it has been picked up by one local paper's website, a large football website, and a local magazine site, all with pretty much the same content and a few pictures.
Any help, hints, or tips on how to go about this if I am going to be sending it out to a load of other sites/blogs?
Cheers
-
I think the only time you need to be worried about this is if you have the exact same content published on your own website/blog.
If you do, ensure you add a rel="author" link (pointing either to an author page on your site or to your Google+ profile), and make sure your web page is indexed in the search engines before any other site publishes identical or similar content.
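A minimal sketch of what that markup could look like (the author-page URL and profile ID below are placeholders, not details from the thread):

  <!-- in-content link to an author page on your own site -->
  <a href="https://www.example.com/about/jane-doe" rel="author">Jane Doe</a>
  <!-- or, in the page head, pointing to a Google+ profile -->
  <link rel="author" href="https://plus.google.com/AUTHOR_PROFILE_ID" />

-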
Thanks for your responses. The press release hasn't been submitted to a PR website. I have emailed individual sites/blogs and newspapers to see if they will publish the charity night we did.
So it is all real news of an event that actually happened.
Cheers
-
Since you are actually distributing "real news" and not just fluff news targeting keywords for SEO, I would think "spinning" it would be worse.
But to be honest, it is a hard question to answer. Technically, a press release is not supposed to be posted on other sites word for word; a press release is meant to entice journalists and bloggers to write about your story. Only in the days of easy manipulation was it turned into another type of "link building" strategy.
I would suggest submitting the same one. But with sites like PRWeb, it is hard these days to really justify it; to be honest, is it not just a big network for gaining links?
IMHO of course
-
Not necessarily.
A lot of PRWeb's high-authority distribution network is nofollow.
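For anyone unfamiliar with the term, a nofollowed link in a syndicated copy of the release would be published roughly like this (the URL is a placeholder):

  <a href="https://www.example-store.com/charity-night" rel="nofollow">Example Store</a>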
Also, just because it is picked up does not mean it is not duplicate content, or that it will not be treated as duplicate content in the future.
Over the past few years I have personally seen a massive decline in the "white hat" SEO benefits of online press release distribution.
I am also not sure I would agree, post Panda/Penguin, that press releases are a good way to pass "link juice".
I am not saying it is a lost cause, but I just feel your assessment is a little circa 2009-2010.
-
I use PRWeb to distribute press releases, and while each press release gets picked up by anywhere from 30 to 100 news outlets, I haven't yet noticed any issues with it being flagged as duplicate content. It's more a way to get link juice to your website. If multiple news sources publish the same press release, it doesn't reflect negatively on your site metrics; it's a good thing.
Related Questions
-
Penalty for duplicate content on the same website?
Is it possible to get a penalty for duplicate content on the same website? I have an old custom-built site with a large number of filters that are pre-generated for speed. Basically the only difference is the meta title and H1 tag, with a few text differences here and there. Obviously I could nofollow all the filter links, but it would take an enormous amount of work. The site is performing well in search. I'm trying to decide whether there is a risk of a penalty; if not, I'm loath to do anything in case it causes other issues.
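For reference, nofollowing a filter link as the poster describes would look something like this (the URL and anchor text are invented for illustration):

  <a href="/products?colour=red" rel="nofollow">Red</a>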
Intermediate & Advanced SEO | | seoman100 -
Best strategy for duplicate content?
Hi everyone, We have a site where all product pages have more or less similar text (same printing techniques, etc.). The main differences are prices and images; the text is highly similar. We have around 150 products in every language. Moz's algorithm tells me to do something about duplicate content, but I don't really know what we could do, since the descriptions can't be changed to be very different. We essentially have paper bags in different colors and from different materials.
Intermediate & Advanced SEO | | JaanMSonberg0 -
Is all duplication of HTML title content bad?
In light of Hummingbird, and given that HTML titles are the main selling point in SERPs, is my approach to keyword-rich HTML titles bad? Where possible I try to include the top key phrase describing the page, followed by a second top key phrase describing what the company/site as a whole is or does. For instance, an estate agent's site could use HTML titles such as "Buy Commercial Property in Birmingham | Commercial Estate Agents" and "Birmingham Commercial Property Tips | Commercial Estate Agents". In order to preserve valuable characters I have also been omitting brand names other than on the home page... is this also poor form?
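Expressed as markup, those two example titles are simply title elements:

  <title>Buy Commercial Property in Birmingham | Commercial Estate Agents</title>
  <title>Birmingham Commercial Property Tips | Commercial Estate Agents</title>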
Intermediate & Advanced SEO | | SoundinTheory0 -
Joomla Duplicate Page content fix for mailto component?
Hi, I am currently working on my site and have the following duplicate page content issues: "My Uni Essays" at http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2631849e33 and "My Uni Essays" at http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2edd30f8c6. This happens 15 times. Any ideas on how to fix this, please? Thank you
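The thread does not include an answer, but as an illustration only, one common approach for component URLs like these is a robots.txt rule on the shared path (the path is taken from the URLs above):

  User-agent: *
  Disallow: /component/mailto/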
Intermediate & Advanced SEO | | grays01800 -
Do I need an invitation for Press Release Point?
Hey everyone, I went to this page http://www.pressreleasepoint.com/submit-press-release on Press Release Point and it said "Sorry, new user registration by invitation only." Does anyone know a way around it, or how I can get an invitation? Thanks 🙂
Intermediate & Advanced SEO | | jhinchcliffe0 -
Duplicate Page Content / Titles Help
Hi guys, My SEOmoz crawl diagnostics throw up thousands of duplicate page content / title errors, which mostly come from the forum attached to my website. In particular, it's the forum users' profiles that are causing the issue; below is a sample of the URLs that are being penalised: http://www.mywebsite.com/subfolder/myforum/pop_profile.asp?mode=display&id=1308 I thought that adding http://www.mywebsite.com/subfolder/myforum/pop_profile.asp to my robots.txt file under 'Ignore' would cause the bots to overlook the thousands of profile pages, but the latest SEOmoz crawl still picks them up. My question is, how can I get the bots to ignore these profile pages (they don't contain any useful content), and how much will this be affecting my rankings (bearing in mind I have thousands of errors for duplicate content and duplicate page titles)? Thanks guys Gareth
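For reference, a standard robots.txt rule for the profile script mentioned above would look roughly like this (the path is taken from the question):

  User-agent: *
  Disallow: /subfolder/myforum/pop_profile.asp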
Intermediate & Advanced SEO | | gaz33420 -
How to compete with duplicate content in a post-Panda world?
I want to fix duplicate content issues on my eCommerce website. I have read a very valuable blog post on SEOmoz regarding duplicate content in a post-Panda world and have applied every strategy to my website. Let me give one example to explain it in more detail. The page is http://www.vistastores.com/outdoor-umbrellas and the non-www version, http://vistastores.com/outdoor-umbrellas, redirects to the home page.
For HTTPS pages such as https://www.vistastores.com/outdoor-umbrellas I have created a robots.txt file (https://www.vistastores.com/robots.txt) and set rel=canonical to the HTTP page, http://www.vistastores.com/outdoor-umbrellas.
Narrow by search: my website has narrow-by-search and generates pages with the same meta info, such as http://www.vistastores.com/outdoor-umbrellas?cat=7, http://www.vistastores.com/outdoor-umbrellas?manufacturer=Bond+MFG and http://www.vistastores.com/outdoor-umbrellas?finish_search=Aluminum. I have restricted all of these dynamic pages via robots.txt (http://www.vistastores.com/robots.txt) and set rel=canonical to the base URL on each dynamic page.
Order-by pages: http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name I have restricted all of these pages with robots.txt and set rel=canonical to the base URL.
Pagination pages: http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=2 I have restricted all of these pages with robots.txt, set rel=next and rel=prev on all paginated pages, and also set rel=canonical to the base URL.
I have applied all of these SEO suggestions to my website, but Google is crawling and indexing 21K+ pages, while my website has only 9K product pages. Google search result: https://www.google.com/search?num=100&hl=en&safe=off&pws=0&gl=US&q=site:www.vistastores.com&biw=1366&bih=520 Over the last 7 days my website's impressions and CTR have dropped by 75%. I want to recover and perform as well as before. I have explained my question at this length because I want to recover my traffic as soon as possible.
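A minimal sketch of the canonical and prev/next markup described above, with illustrative page parameters (the real URLs also carry the sort parameters shown in the question):

  <!-- on page 2 of the paginated category -->
  <link rel="canonical" href="http://www.vistastores.com/outdoor-umbrellas" />
  <link rel="prev" href="http://www.vistastores.com/outdoor-umbrellas?p=1" />
  <link rel="next" href="http://www.vistastores.com/outdoor-umbrellas?p=3" />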
Intermediate & Advanced SEO | | CommercePundit0 -
Managing Large Regulated or Required Duplicate Content Blocks
We work with a number of pharmaceutical sites that, under FDA regulation, must include an "Important Safety Information" (ISI) content block on each page of the site. In many cases this duplicate content is not only provided on a dedicated ISI page, it is quite often longer than what would be considered the primary content of the page. At first blush a rel=canonical tag might appear to be a solution for signalling search engines that there is a specific page for the ISI content and avoiding a penalty, but the pages also contain original content that should be indexed, as it has user benefit beyond the information contained within the ISI. Is anyone else running into this challenge with regulated duplicate boilerplate, and has anyone developed a workaround for handling duplicate content at the paragraph level rather than the page level? One clever suggestion was to treat it as a graphic; however, for a pharma site this would be a huge graphic.
Intermediate & Advanced SEO | | BlooFusion380