What is the best press release website?
-
What is the best press release website for getting a press release in front of writers and news sources? We have used a couple with OK success, but I would like to hear and learn from the experiences of others.
Thanks
*I am not concerned with the page rank of the press release site or any SEO benefit coming from the PR website itself.
-
Yes, I believe I have experienced the opposite. I use PRWeb a lot. The link value, I find, is not necessarily in the press release itself; it is in the many industry-specific pickups that choose to re-post a well-written article. Today, most of my press releases add considerable value to my clients because they are often picked up in full with all links kept intact. This means my article is carried by reader choice to my niche, giving the validation that search engines are seeking.
Another key positive factor of a newsworthy press release is that Google seems to view "duplicate content" in another light. When content is considered "news," search engines anticipate it being copied and shared.
Hope that helps.
-
No problem, Alan. I knew this question would seem like the typical "Can I rank from PR links?" question.
Honestly, that is why we have never used these sites. In theory, if these sites were used correctly, I think they might be valuable, but the amount of spam and hopeless SEOs posting useless content for a face-value link seems to drown out any hope of useful content ever being found or trusted by editors and writers.
I just wanted to check and see if anyone has ever experienced the opposite and had a real story or blog post come from their press release being found on a PR site.
-
OK, got ya.
Most times I hear such a question, it is from people who think PR sites are going to put them at #1. I should have read your question to the end.
-
Hi Alan, thanks for your response. I am not concerned with the link from the press release website; I know it is not valuable. I'm just wondering if anyone has ever had a press release they published on these websites turn into an actual article or blog post on a relevant website. Currently we do not use these websites. We individually reach out to news sources and blogs, but I know there is no way we could target 100% of the relevant sources that could potentially publish content. That is the only use I can see these PR websites serving for us, if they deliver on it. Also, given that tons of companies obviously still use these websites to publish their press releases, I wanted to hear if anyone has ever experienced success with them.
-
IMO, none.
Why would a search engine give any value to a link you can obtain yourself? They want to see that other people find your page exciting, not you. Since Penguin, I have seen better results from removing links than from adding them.
-
Yes, I have used prweb.com and PRLOG.org. They are good, but I would recommend finding the best websites in your niche and contacting them about your press release. If they agree, publish there.
No doubt you can go with prweb.com and PRLOG.org, but Google knows these sites very well and weights them according to its own preferences. If you publish your news on sites within your niche, Google sees those as relevant links.
-
Takeshi, thanks for your answer. We do take a tailored approach for local news sources and industry-specific blogs. That is where we have a huge success rate, but I was more looking to see if you or anyone else has had success using websites like prweb.com or prnewswire.com. By success, I mean relevant news sources creating articles/blog posts or contacting you for a story.
Also, thanks for the book recommendation. I will definitely check that one out.
-
It really depends on what industry you are in, and what writers/news sources you are targeting.
For local news sites, it's often best to get the contact info for their tips/news submissions and submit your tailored press release to them directly. Meeting the journalists in person will also increase your chances of them considering your news item.
Same with bloggers. Find the bloggers you're interested in, learn what kind of content they tend to cover, and send them releases presented in a way that's easy for them to digest. Many times, content-hungry bloggers will simply copy and paste your press release in its entirety if it's written well enough.
I would also recommend the book "Trust Me, I'm Lying" for some good advice on how to get blog/news coverage.