Submitting Same Press Release Content to Multiple PR Sites - Good or Bad Practice?
-
I see some PR (press release) distribution sites that publish the same content on many different sites and add a source link at the end. Is that good SEO practice or bad?
If it is good practice, how do Google Panda and other algorithms treat it?
-
Submitting the same press release content to multiple press release submission sites is generally considered a bad practice. Duplicate content can negatively impact search engine optimization (SEO) and lead to potential ranking problems. Moreover, it diminishes the uniqueness and effectiveness of your message across platforms. Many press release submission sites also have terms of service against duplicate submissions, risking account suspension or removal. It's advisable to tailor your releases for each platform to maximize impact and avoid potential repercussions.
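If you do reuse a release, one quick way to sanity-check how much two tailored drafts still overlap is a simple similarity ratio. A minimal sketch in Python (the file names are placeholders):

from difflib import SequenceMatcher

def similarity(path_a: str, path_b: str) -> float:
    """Return a 0-1 similarity ratio between two text files."""
    with open(path_a, encoding="utf-8") as f:
        text_a = f.read()
    with open(path_b, encoding="utf-8") as f:
        text_b = f.read()
    return SequenceMatcher(None, text_a, text_b).ratio()

if __name__ == "__main__":
    # Placeholder file names for two drafts of the same release.
    ratio = similarity("release_site_a.txt", "release_site_b.txt")
    print(f"Similarity: {ratio:.0%}")
    if ratio > 0.8:
        print("Drafts are still near-duplicates; consider rewriting further.")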
-
hello
-
Hi Tymen Boon,
Thanks once again for answering!!
I agree with you that prnewswire has good PA and DA and even a low spam score.
As you said in your previous reply, we can post on high-authority sites. Suppose I post PR content on, let's say, prnewswire, which is a paid PR site with a media network of more than 100 sites (good and bad).
With good PA, good DA, a low Alexa rank and a low spam score, prnewswire overall looks the best.
Now, this site distributes my content to 30-40 other sites as the same content with a source link. If Google Panda updates at some point and the whole network, including prnewswire, gets hit badly, will that affect my site directly or indirectly, or not at all?
Thanks!!
-
Hi Karan,
If you check it on Moz OSE (Open Site Explorer):
The site looks OK to me. The traffic they have is not very relevant in this case, as you only do it for the link. I have multiple business listings on press sites and business review sites like Yelp. "What doesn't kill you makes you stronger", I presume.
Good luck with all!
Tymen
-
Hi Tymen Boon,
Thanks for your reply!!
As you said, I should post the content on my site first; then I can post it on high-PA/DA, low-spam-score PR sites with a source link back to my site.
If we look at the most popular PR site, prnewswire.com, it distributes the same content to more than 30 other sites, with its source link at the bottom of each PR page. In 2014, Panda 4.0 hit prnewswire badly, as described in this report: http://searchengineland.com/google-panda-4-0-go-press-release-sites-192789.
What would you say in this case? Thanks!!
-
Hi Karan,
With press releases this is always the case. In my opinion it is fine as long as you put the content on your own site first. In the press release you then link to this source, as you mention. I would advise looking at the link quality (spam score and DA) of the press sites you want to send it to. It is better to have a couple of very good links than a lot of low-quality free publicity links.
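For example, a rough script along these lines could pull those metrics for a shortlist of press sites before you choose. This is only a sketch: it assumes Moz Links API v2 credentials, and the endpoint and field names are from memory, so check them against the current docs.

import requests  # assumes the requests package is installed

ACCESS_ID = "your-moz-access-id"    # placeholder credentials
SECRET_KEY = "your-moz-secret-key"

press_sites = ["prnewswire.com", "prweb.com"]  # example targets

resp = requests.post(
    "https://lsapi.seomoz.com/v2/url_metrics",  # Moz Links API v2 endpoint (assumed)
    auth=(ACCESS_ID, SECRET_KEY),
    json={"targets": press_sites},
    timeout=30,
)
resp.raise_for_status()

for row in resp.json().get("results", []):
    # Field names (domain_authority, page_authority, spam_score) may differ
    # slightly from the live API; treat this as a sketch, not gospel.
    print(row.get("page"),
          "DA:", row.get("domain_authority"),
          "PA:", row.get("page_authority"),
          "Spam:", row.get("spam_score"))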
Good luck!
Tymen
Related Questions
-
Google Mobile site crawl returns poorer results on 100% responsive site
Has anyone experienced an issue where Google Mobile site crawl returns poorer results than their Desktop site crawl on a 100% responsive website that passes all Google Mobile tests?
Intermediate & Advanced SEO | MFCommunications
-
Beta Site Removal best practices
Hi everyone.
Intermediate & Advanced SEO | bgvsiteadmin
We are doing a CMS migration and site redesign with some structural changes. Our temporary beta site (one of the staging environments and the only one that is not behind a firewall) started appearing in search. The site got indexed before we added robots.txt, due to a dev error (at that time all pages were index,follow because of the nature of the beta site; it is a final stage that mirrors the live site). As a remedy, we implemented robots.txt for the beta version as:
User-Agent: *
Disallow: /
We also removed the beta from search for 90 days and changed all pages to noindex/nofollow. Those blockers will be changed once the code for the beta gets pushed into production. However, we already have all links redirected (301) from the old site to the new one; this will go into effect once the migration starts (we will go live with the completely redesigned site that is now in beta in a few days). After that, the beta will be deleted completely and become 404 or 410. So the question is: should we delete the beta site and simply return 404/410 without any redirects (the site as is existed for only a few days)? What is the best thing to do? We don't want to hurt our SEO equity. Please let me know if you need more clarification. Thank you!
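One way to sanity-check the cut-over once it happens, sketched below with placeholder URLs: confirm that the old URLs 301 to the new site and that the deleted beta URLs return 404 or 410 (assumes the requests package is installed).

import requests

# Placeholder URLs; substitute the real old-site, new-site and beta addresses.
redirect_checks = {
    "https://www.example.com/old-page/": "https://www.example.com/new-page/",
}
beta_urls = ["https://beta.example.com/", "https://beta.example.com/about/"]

for old, expected in redirect_checks.items():
    r = requests.head(old, allow_redirects=False, timeout=10)
    ok = r.status_code == 301 and r.headers.get("Location") == expected
    print(old, "->", r.status_code, r.headers.get("Location"), "OK" if ok else "CHECK")

for url in beta_urls:
    r = requests.head(url, allow_redirects=False, timeout=10)
    status = "OK" if r.status_code in (404, 410) else "STILL LIVE?"
    print(url, r.status_code, status)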
23k Links from one domain pointing to a single page, good or bad?
Hey all, So I found a domain that GWT tells me has 23k links pointing to a single landing page. I found that the link is part of their global nav as a text ad, and that's probably why it registers so many links. The site has a DA of 56; is this a bad thing? Could it be hurting the rest of my site's ability to rank? Thanks, Roman
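The number that usually matters more is unique linking domains rather than total links, since one sitewide nav link is counted thousands of times. A rough sketch for collapsing an exported backlink list (the CSV path and column name are placeholders):

import csv
from collections import Counter
from urllib.parse import urlparse

# Placeholder export: a CSV with a "source_url" column, e.g. from GWT or a link tool.
domains = Counter()
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domains[urlparse(row["source_url"]).netloc] += 1

print("Unique linking domains:", len(domains))
for domain, count in domains.most_common(10):
    print(domain, count)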
Intermediate & Advanced SEO | Dynata_panel_marketing
-
If I put a piece of content on an external site, can I syndicate it to my site later using a rel=canonical link?
Could someone help me with a 'what if' scenario please? What happens if I publish a piece of content on an external website, but then later decide to also put this content on my website? I want my website to rank first for this content, even though the original location for the content was the external website. Would it be okay for me to put a rel=canonical tag on the external website's copy pointing to the copy on my website? Or would this be seen as manipulative?
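If the external site does agree to add that cross-domain canonical, a quick check along these lines could confirm the tag really points back at your copy (the URLs are placeholders, and it assumes the requests and beautifulsoup4 packages are installed):

import requests
from bs4 import BeautifulSoup

# Placeholder URLs for illustration only.
syndicated_url = "https://external-site.example/guest-article/"
my_url = "https://www.mysite.example/original-article/"

html = requests.get(syndicated_url, timeout=10).text
canonical = BeautifulSoup(html, "html.parser").find("link", rel="canonical")

if canonical and canonical.get("href") == my_url:
    print("Canonical points at your copy.")
else:
    print("Canonical missing or pointing elsewhere:",
          canonical.get("href") if canonical else None)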
Intermediate & Advanced SEO | RG_SEO
-
Our quilting site was hit by Panda/Penguin...should we start a second "traffic" site?
I built a website, LearnHowToMakeQuilts.com, for my wife, who is a quilter. However, it has been hit by Panda or Penguin (I'm not quite sure) and I am scared to tell her to go ahead and keep building the site up. She really wants to post on her blog on Learnhowtomakequilts.com, but I'm afraid it will be in vain for Google's search engine. Yahoo and Bing still rank well. I don't want her to produce good content that will never rank well if the whole site is penalized in some way. I've over-optimized by linking strongly to our main keyword, "how to make a quilt", mainly to the home page, and I think that is one of the main reasons we are incurring some kind of penalty. First main question: From looking at the attached Google Analytics image, does anyone know if it was Panda or Penguin that we were "hit" by? And what can be done about it? (We originally wanted to build a nice content website, but were lured in by a get-rich-quick personality to instead make a "squeeze page" for the home page and force all our visitors through that page to get to the really good content.) Thus, our average time on site per person is terrible and pages per visit is low, at 1.2. We really want to try to improve it some day. She has a local business website, Customcarequilts.com, that did not get hit. Second question: Should we start a second site rather than invest the time in trying to repair the damage from my bad link building and article marketing? We do need to keep the site up and running because it has her online quilting course for beginner quilters to learn how to quilt their first quilt. We host the videos through Amazon S3 and were selling at least one course every other day. But now that the Google drop has hit, we are lucky to sell one quilting course per month. So, if we start a second site, we can build it as a big content site that we use to introduce people to learnhowtomakequilts.com, which has Martha's quilting course. So, should we go ahead and start a fresh site rather than repair the damage done by my bad over-optimizing? (We've already picked out a great website name that would work really well with her personal Facebook page.) Or, here's a second option, which is to use her local business website, customcarequilts.com. She created it in 2003 and has had it ever since. It is only PR 1. Would this be an option? Anyway, I'm looking for guidance on whether we should pursue repairing the damage, and whether we should start a second fresh site or use an existing site to create new content (for getting new quilters to eventually purchase her course). Brad & Martha Novacek
Intermediate & Advanced SEO | BradNovi
-
Sites with dynamic content - GWT redirects and deletions
We have a site that has extremely dynamic content. Every day they publish around 15 news flashes, each of which is setup as a distinct page with around 500 words. File structure is bluewidget.com/news/long-news-article-name. No timestamp in URL. After a year, that's a lot of news flashes. The database was getting inefficient (it's managed by a ColdFusion CMS) so we started automatically physically deleting news flashes from the database, which sped things up. The problem is that Google Webmaster Tools is detecting the freshly deleted pages and reporting large numbers of 404 pages. There are so many 404s that it's hard to see the non-news 404s, and I understand it would be a negative quality indicator to Google having that many missing pages. We were toying with setting up redirects, but the volume of redirects would be so large that it would slow the site down again to load a large htaccess file for each page. Because there isn't a datestamp in the URL we couldn't create a mask in the htaccess file automatically redirecting all bluewidget.com/news/yymm* to bluewidget.com/news These long tail pages do send traffic, but for speed we only want to keep the last month of news flashes at the most. What would you do to avoid Google thinking its a poorly maintained site?
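One way to get the non-news 404s back into view is to split the Webmaster Tools 404 export by path; a rough sketch (the CSV path and column name are placeholders):

import csv
from urllib.parse import urlparse

# Placeholder export: a CSV of 404 URLs from Google Webmaster Tools.
news_404s, other_404s = [], []
with open("crawl_errors_404.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row["URL"]
        (news_404s if urlparse(url).path.startswith("/news/") else other_404s).append(url)

print(f"News-flash 404s: {len(news_404s)} (expected from the deletions)")
print(f"Other 404s worth fixing: {len(other_404s)}")
for url in other_404s:
    print(" ", url)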
Intermediate & Advanced SEO | ozgeekmum
How good or bad are free WordPress themes for SEO purposes?
I was wondering if free WordPress themes would suffice, as long as the right plugins were added for SEO purposes?
Intermediate & Advanced SEO | bronxpad
Handling Similar page content on directory site
Hi All, SEOMOZ is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but they are very similar, because the site is a directory website with a page for cities in multiple states in the US. I do not want these pages being indexed and wanted to know the best way to go about this. I was thinking I could put a rel="nofollow" on all the links to those pages, but I'm not sure that is the correct way to do this. Since the folders are deep within the site and not under one main folder, it would mean doing a disallow for many folders if I did this through robots.txt. The other thing I am thinking of is a meta noindex, follow, but I would have to get my programmer to add a meta tag just for this section of the site. Any thoughts on the best way to achieve this so I can eliminate these dup pages from my SEO report and from the search engine index? Thanks!
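If the meta noindex, follow route is chosen, a small audit script like the sketch below could confirm the tag is actually in place on the city pages (the URLs are placeholders, and it assumes the requests and beautifulsoup4 packages are installed):

import requests
from bs4 import BeautifulSoup

# Placeholder city-page URLs for illustration only.
city_pages = [
    "https://www.example-directory.com/state/city-one/",
    "https://www.example-directory.com/state/city-two/",
]

for url in city_pages:
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("meta", attrs={"name": "robots"})
    content = (tag.get("content") or "").lower() if tag else ""
    status = "noindex OK" if "noindex" in content else "MISSING noindex"
    print(url, status)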
Intermediate & Advanced SEO | cchhita