Submitting Same Press Release Content to Multiple PR Sites - Good or Bad Practice?
-
I see some PR (press release) sites that distribute the same content across many different sites, adding a source link at the end. Is that good or bad SEO practice?
If it is a good practice, how do Google Panda and other algorithms treat it?
-
Submitting the same press release content to multiple press release submission sites is generally considered a bad practice. Duplicate content can negatively impact search engine optimization (SEO) and may lead to ranking penalties. It also diminishes the uniqueness and effectiveness of your message across platforms. Many free press release submission sites also have terms of service against duplicate submissions, risking account suspension or removal. It's advisable to tailor your release for each platform to maximize impact and avoid potential repercussions.
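To make "duplicate content" concrete: search engines detect near-duplicates with techniques such as shingling. The sketch below is an illustrative simplification (the thresholds and method are not Google's published internals), comparing two release drafts with Jaccard similarity over word 5-grams so you can gauge how much rewriting separates them.

```python
# Rough gauge of how "duplicate" two press-release drafts are, using
# Jaccard similarity over word 5-gram shingles -- a simplified stand-in
# for the near-duplicate detection search engines are known to use.
# Nothing here reflects Google's actual thresholds.

def shingles(text: str, k: int = 5) -> set:
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

draft_a = "Acme Corp today announced the launch of its new widget line for retailers"
draft_b = "Acme Corp today announced the launch of its new widget line for retailers"
print(jaccard(draft_a, draft_b))  # identical drafts -> 1.0
```

A score near 1.0 means the drafts are effectively the same document; meaningfully tailored versions for different platforms should score well below that.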
-
Hi Tymen Boon,
Thanks once again for answering!
I agree with you that PRNewswire has good PA and DA, and even a low spam score.
As you said in your previous reply, we can post on high-authority sites. Say I post a press release on PRNewswire, a paid PR site with a media network of more than 100 sites (good and bad).
With good PA, good DA, a low Alexa rank, and a low spam score, PRNewswire looks best overall.
Now, this site distributes my content to 30-40 other sites as the same content with a source link. If a Google Panda update hits the whole network badly at some point, PRNewswire included, does that affect my site directly, indirectly, or not at all?
Thanks!
-
Hi Karan,
If you check it on Moz OSE:
The site looks OK to me. The traffic they have is not very relevant in this case, as you only do it for the link. I have multiple business listings on press sites and business review sites like Yelp. "What doesn't kill you makes you stronger," I presume.
Good luck with all!
Tymen
-
Hi Tymen Boon,
Thanks for your reply!
As you said, I should post the content on my own site first, then post it to high-PA/DA, low-spam-score PR sites with a source link back to my site.
Looking at the most popular PR site, prnewswire.com: it distributes the same content to more than 30 sites, with its source link at the bottom of each release. In 2014, Panda 4.0 hit PRNewswire badly, as reported here: http://searchengineland.com/google-panda-4-0-go-press-release-sites-192789.
What would you say in this case? Thanks!
-
Hi Karan,
With press releases this is always the case. In my opinion it is fine as long as you put the content on your own site first. In the press release you then link to that source, as you mention. I would advise looking at the link quality (spam score and DA) of the press sites you want to send it to. It is better to have a couple of very good links than a lot of low-quality free publicity links.
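The "couple of very good links" filter above can be sketched as a simple quality gate. The site names and metric values below are made up for illustration, and the DA/spam-score cutoffs are assumptions you would tune to your own standards, not fixed industry rules:

```python
# Illustrative "few good links" filter: keep only press sites whose
# (hypothetical) Domain Authority and spam score clear a quality bar.
# All metric values and thresholds here are invented for the example.

candidates = [
    {"site": "prnewswire.com", "da": 92, "spam_score": 1},
    {"site": "free-pr-blast.example", "da": 18, "spam_score": 14},
    {"site": "industry-wire.example", "da": 61, "spam_score": 3},
]

def worth_submitting(site: dict, min_da: int = 50, max_spam: int = 5) -> bool:
    """True if the site meets both the DA floor and the spam-score ceiling."""
    return site["da"] >= min_da and site["spam_score"] <= max_spam

shortlist = [s["site"] for s in candidates if worth_submitting(s)]
print(shortlist)  # -> ['prnewswire.com', 'industry-wire.example']
```

The point of the exercise: a short, curated shortlist beats blasting the release to every free submission site.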
Good luck!
Tymen