Duplicate Content on Press Release?
-
Hi,
We recently held a charity night in store, and had a few local celebs turn up.
We created a press release to send out to various media outlets. Within the press release are hyperlinks to our site, plus links on certain keywords pointing to specific brands on our site.
My question is: should we be sending a different press release to each outlet to avoid duplicate content issues, or is sending the same release out to everyone OK?
We will be sending out approximately 20 of these, some going online and some not.
So far it has been picked up by one local paper's website, a massive football website, and a local magazine site, all with pretty much the same content and a few pics.
Any help, hints, or tips on how to go about this, given that I will be sending it out to a load of other sites/blogs?
Cheers
-
I think the only time you need to worry about this is if you have the exact same content published on your own website/blog.
If you do, make sure you add a rel="author" link (pointing to either an author page on your site or your Google+ profile), and that your page is indexed by the search engines before any other site publishes identical or similar content.
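As a rough sketch, the markup could look like this (the URLs are placeholders, not real profiles or pages):

  <!-- in the page <head>, pointing at a Google+ profile -->
  <link rel="author" href="https://plus.google.com/your-profile-id">
  <!-- or as a visible byline link pointing at an on-site author page -->
  <a href="https://example.com/about/author-name" rel="author">Author Name</a>

Either form gives Google an authorship signal it can use to associate the original article with you before the syndicated copies appear.
-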
Thanks for your responses. The press release hasn't been submitted to a PR website; I have emailed individual sites/blogs and newspapers to see if they will publish a piece on the charity night we held.
So it is all real news about an event that actually happened.
Cheers
-
Since you are actually distributing real news, and not just fluff written to target keywords for SEO, I would think "spinning" it would be worse.
But to be honest, it is a hard question to answer. Technically, a press release is not supposed to be posted on other sites word for word; a press release exists to entice journalists and bloggers to write about your story. Only in the days of easy ranking manipulation was it turned into another type of "link building" strategy.
I would suggest submitting the same one. But with sites like PRWeb, it is hard these days to really justify it; to be honest, is it not just a big network for gaining links?
IMHO of course
-
Not necessarily.
A lot of PRWeb's high-authority distribution network uses nofollow links.
Also, just because a release is picked up does not mean it is not duplicate content, or that it will not be treated as duplicate content in the future.
Over the past few years I have personally seen a massive decline in the "white hat" SEO benefits of online press release distribution.
Post-Panda/Penguin, I am also not sure I would agree that press releases are a good way to pass "link juice".
I am not saying it is a lost cause; I just feel your assessment is a little circa 2009-2010.
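For reference, a nofollowed link just carries an extra rel attribute, which asks search engines not to pass equity through it (the URL is a placeholder):

  <a href="https://example.com/your-site" rel="nofollow">Your Brand</a>

So even when a release is republished across a large distribution network, links marked up like this contribute little or no "link juice".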
-
I use PRWeb to distribute press releases, and while each press release gets picked up by anywhere from 30 to 100 news outlets, I haven't yet noticed any issues with it being flagged as duplicate content. It's more a way to get link equity to your website. If multiple news sources publish the same press release, it doesn't reflect negatively on your site metrics; it's a good thing.
Related Questions
-
May faceted navigation via ajax #parameters cause duplicate content issues?
We are going to implement a faceted navigation for an ecommerce site of about 1000 products. The faceted navigation is implemented via ajax/javascript, which adds a large number of #parameters to the URL. Faceted pages canonicalize to the page without any parameters. We do not want Google to index any of the faceted pages at this point. Will Google include pages with #parameters in its index? Can I somehow tell Google to ignore #parameters and not index them? Could this setup cause any SEO problems for us in terms of crawl bandwidth and/or link equity?
Intermediate & Advanced SEO | lcourse
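For what it's worth, the canonical tag being described here would be a one-liner in the <head> of each faceted page, pointing at the parameter-free version (the URL is a placeholder):

  <link rel="canonical" href="https://example.com/category-name">

Note that crawlers generally ignore everything after a # fragment, since it is never sent to the server, so true #parameter URLs are usually not indexed as separate pages in the first place.
-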
What is considered duplicate content?
Hi, we are working on a product page for bespoke camper vans: http://www.broadlane.co.uk/campervans/vw-campers/bespoke-campers . At the moment there is only one page, but we are planning to add similar pages for other brands of camper vans. Each page will receive its own specifically targeted content; however, the 'Model choice' cart at the bottom (giving you the choice to select the internal structure of the van) will remain the same across all pages. Will this be considered duplicate content? And if so, what would be the ideal solution to limit penalty risk? A rel canonical tag seems wrong for this, as there is no original item as such. Would an iframe around the 'Model choice' section enable us to isolate that content from being indexed along with the page? Thanks, Celine
Intermediate & Advanced SEO | A_Q
-
Will merging sites create a duplicate content penalty?
I have two sites that would be better suited to being merged to create a more authoritative site. Basically, I'd like to merge site A into site B. If I add the pages from site A to site B and create 301 redirects from those pages on site A to the new pages on site B, is that the best way to go about it? As the pages are already indexed, would this create any duplicate content issue, or would the redirect solve this?
Intermediate & Advanced SEO | boballanjones
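As a minimal sketch, assuming site A runs on Apache and using placeholder paths and domains, the redirects could live in site A's .htaccess:

  # .htaccess on site A: permanently redirect a moved page to its new home on site B
  Redirect 301 /old-page.html https://www.site-b.example.com/new-page.html

Because a 301 tells search engines the page has permanently moved, the old URL stops competing with the new one, so the duplicate content concern resolves itself as the redirects are picked up.
-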
Big problem with duplicate page content
Hello! I am a beginner SEO specialist and I have a problem with duplicate page content. The site I'm working on is an online shop made with Prestashop. The Moz crawl report shows that I have over 4000 duplicate page content issues; two weeks ago I had 1400. The majority of links that show duplicate content look like this:
http://www.sitename.com/category-name/filter1
http://www.sitename.com/category-name/filter1/filter2
At first I thought that the filters didn't work. But when I browse the site and test it, I see that the filters are working and generate links like this:
http://www.sitename.com/category-name#/filter1
http://www.sitename.com/category-name#/filter1/filter2
The links without the # do not work; they mess up the filters. Why are the pages being indexed without the #, thus generating duplicate content? How can I fix this? Thank you very much!
Intermediate & Advanced SEO | ana_g
-
How do you reduce duplicate content for tags and categories in WordPress?
Is it possible to avoid a duplicate content error without limiting a post to only one category or tag?
Intermediate & Advanced SEO | Mivito
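One common approach, sketched below, is to leave posts in as many categories and tags as you like but mark the archive pages themselves noindex (most SEO plugins expose this as a setting; hand-rolled, it would go in the archive templates):

  <meta name="robots" content="noindex, follow">

The follow part keeps crawlers moving through the links on the archive while keeping the archive page itself, with its duplicated excerpts, out of the index.
-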
Joomla duplicate page content fix for mailto component?
Hi, I am currently working on my site and have the following duplicate page content issues:
http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2631849e33
http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2edd30f8c6
This happens 15 times. Any ideas on how to fix this, please? Thank you
Intermediate & Advanced SEO | grays0180
-
About robots.txt to resolve duplicate content
I am having trouble with duplicate content and titles. I have tried many ways to resolve them, but because of the site's code I am still stuck, so I have decided to use robots.txt to block the duplicated content.
The first question: how do I write a robots.txt rule to block all URLs like these?
http://vietnamfoodtour.com/foodcourses/Cooking-School/
http://vietnamfoodtour.com/foodcourses/Cooking-Class/
User-agent: *
Disallow: /foodcourses
(Is that right?)
And the parameter URLs:
http://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4
User-agent: *
Disallow: /?mod=vietnamfood
(Is that right? I have a folder containing the module; could I use Disallow: /module/* ?)
The second question: which takes priority, robots.txt or the meta robots tag? If I use robots.txt to block a URL, but the meta robots tag on that URL is "index, follow", what happens?
Intermediate & Advanced SEO | magician
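As a sketch of the directives being discussed, assuming the URL patterns above are complete, a single robots.txt could handle both:

  User-agent: *
  # blocks /foodcourses/Cooking-School/, /foodcourses/Cooking-Class/, etc.
  Disallow: /foodcourses
  # blocks /?mod=vietnamfood&page=2 and the other module pages
  Disallow: /?mod=vietnamfood

On the priority question: robots.txt effectively wins, because a blocked URL is never fetched, so its meta robots tag is never even seen. The flip side is that a robots.txt-blocked URL can still appear in the index via external links; if you need a page reliably de-indexed, leave it crawlable and use a noindex meta robots tag instead.
-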
Does a mobile site count as duplicate content?
Are there any specific guidelines that should be followed for setting up a mobile site to ensure it isn't counted as duplicate content?
Intermediate & Advanced SEO | nicole.healthline
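A mobile site on separate URLs is not normally treated as duplicate content if the two versions are annotated as alternates of each other. As a sketch with placeholder URLs:

  <!-- on the desktop page, e.g. https://www.example.com/page -->
  <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">
  <!-- on the corresponding mobile page -->
  <link rel="canonical" href="https://www.example.com/page">

This pairing tells Google the two URLs are the same content served to different devices, so the mobile version consolidates signals with the desktop version instead of competing with it.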