SEO problems with PR Newswires
-
Just been investigating PR newswires for the first time (despite having worked in PR for over a decade!).
One of my clients has asked me to send out a news release via a newswire of my choice. I will not be posting the release on my client's website, to avoid the most obvious duplication issue.
Has anyone had SEO problems from newswires, though? I just saw one which offered a "minimum guaranteed number of media websites on which your release is posted" - alarm bells!
-
I've investigated a huge list of newswires. I really like the look of a couple of heavily business-oriented wires - very regulated and unspammy - so for business stories I will be using those. I also found one newswire which is very aware of the SEO implications of too much syndication to awful sites - they appear to syndicate only to a handful of reasonable-quality sites. Others do much heavier syndication. I will stick with the three that appear to be safest for now...
-
This was interesting: http://searchengineland.com/how-prweb-helps-distribute-crap-into-google-news-sites-140597 - I think there's a level of nervousness about what happens further down the line. Interesting times. I think I will hang fire for the time being and explain the issues to my client.
Thanks for the useful feedback, guys - appreciated.
-
I'm still monitoring the release I sent out in November. Some observations:
- Stable traffic of 10-15 visitors each day directly from the PR site where the release was published
- Incoming do-follow links from HIGHLY reputable newspapers (syndicated though, not reporters)
- Ranking increased for the two keywords included in the release
I did not see any negative effect ... yet.
If I do, I'll let you know.
Has anyone else done a test like this recently? Please share.
-
"Minimum guaranteed number of media websites on which your release is posted" alarm bells!"
Not necessarily an "Alarm Bell" on its own, as that really is all syndication is technically -
BUT what this probably means is a HUGE group of websites that just dynamically place any niche relevant content on the sites of that niche. It is most usually all no follow, so penalties would not be felt - but results might not be felt as well. (Unless it is real news and you are just trying to gain exposure and traffic)
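For anyone unfamiliar, a nofollow link is just an ordinary anchor carrying a rel attribute that asks search engines not to pass ranking credit through it. A made-up illustration (the URL is hypothetical):

```html
<!-- A syndicated link marked nofollow: search engines are asked
     not to count it as a ranking endorsement -->
<a href="https://your-client.example/news/release" rel="nofollow">Client launches new product</a>
```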
I am not the biggest fan of online press releases, so I am probably jaded, but from an SEO standpoint it is not necessarily an alarm bell for something bad. It just means your content will be automatically accepted on the distribution partner sites (as long as it passes the quality checks most reputable release agencies run).
Shane
Related Questions
-
Inbound links with malicious anchor text - negative SEO attack
Hi, what should I do about more than 300 links with malicious anchor text that has nothing to do with my content? I have been disavowing these links for the last 5 years. Some of them point to URLs that were changed more than 8 years ago. How can I block this malicious behavior? Thanks in advance.
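For context, Google's disavow tool takes a plain text file uploaded through Search Console: lines starting with # are comments, domain: entries disavow every link from a host, and bare URLs disavow individual pages. A minimal sketch, with made-up domains for illustration:

```
# Hypothetical disavow.txt - all domains are made up for illustration
# Disavow every link from an entire domain
domain:spammy-anchor-network.example
# Disavow a single offending URL
http://bad-links.example/page-with-malicious-anchor.html
```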
White Hat / Black Hat SEO | | Arlinaite470 -
How to find a trustworthy SEO specialist?
How do you find a trustworthy SEO specialist if you don't know a lot about SEO yourself?
White Hat / Black Hat SEO | | DigiVital1 -
Best tips for competing in the SEO industry? (Thank you in advance)
Hello Moz friends, I wanted to ask for your friendly tips. I'm in Colorado, and my competition has business names like "Colorado SEO"; one company owns something like 5 of the top 10 Google-ranked sites under different names. I'm an honest guy, but how does someone compete in a crazily competitive industry? How about you? Did you start at the very bottom and never reach the top, or did you outrank the leaders? I know SEO people are smart, but it's easy to wonder if there is any room left. So I'm just wondering about your success or failure stories with competing in a competitive market online. Any tips are appreciated! Chris
White Hat / Black Hat SEO | | asbchris0 -
Why do expired domains still work for SEO?
Hi everyone, I've been running an experiment for more than a year to see whether buying expired domains still works for link building. I know it's considered black hat, but like I said, I wanted to experiment - that is what SEO is about.

What I did was buy domains that had just expired, immediately add content on a WP setup, fill it with content relevant to the expired domain, and then start building links from these domains to other relevant sites. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains )

This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year, with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any sign that the expired domains, or the sites I link to, have been punished by Google. The sites I'm linking to rank great with ONLY those links 🙂

So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | | Sir0 -
What if white hat SEO does not get results?
If company A is paying 5k a month, and some of that budget goes to buying links or content that might be in the gray area, but it ranks higher than company B, which is following the "rules", paying the same, and not showing up at all - what is company B supposed to do?
White Hat / Black Hat SEO | | EmarketedTeam2 -
Dust.js client-side JavaScript templates & SEO
I work for a commerce company, and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO-friendly, because all the content is served up with JavaScript.

Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this being "safe". Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more

Their explanation: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."

Basically, what would happen on the backend of our site is that we would detect the user-agent of all traffic, and once we found a search bot, serve our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But this technique is cloaking, right?

From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."

Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side fallbacks are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
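To make the mechanics concrete, here is a minimal sketch of that user-agent branch, assuming a Node/Express server. The two render helpers are hypothetical stand-ins for rendering the same Dust.js template server-side versus shipping a shell for client-side rendering:

```typescript
import express from "express";

const app = express();

// Hypothetical stand-ins: in a real setup both helpers would render
// the exact same Dust.js template, just in different places.
function renderServerSide(id: string): string {
  // Fully rendered HTML, as a crawler would receive it.
  return `<html><body><h1>Product ${id}</h1><p>Complete markup, no JS required.</p></body></html>`;
}

function clientShell(id: string): string {
  // Empty shell; the browser fetches data and renders the Dust.js template itself.
  return `<html><body><div id="app" data-product="${id}"></div><script src="/app.js"></script></body></html>`;
}

// Naive crawler detection on the User-Agent header.
const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot/i;

app.get("/product/:id", (req, res) => {
  const ua = req.get("user-agent") ?? "";
  // Same underlying content either way; only where it gets rendered differs.
  res.send(BOT_PATTERN.test(ua) ? renderServerSide(req.params.id) : clientShell(req.params.id));
});

app.listen(3000);
```

Whether this counts as cloaking then comes down entirely to whether the two branches really do convey identical content, which is exactly the judgment call the question wrestles with.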
White Hat / Black Hat SEO | | Bodybuilding.com0 -
Could this be negative SEO?
Hi, I've attached a copy of our Google ranking for one of our keywords, for our site and a competitor. Also shown is the number of external links over time for the same two sites. There seems to be a striking resemblance between the two sites, so could this be the result of negative SEO? What's the best way to determine whether you've been targeted by negative SEO? Thanks. (Attachment: site-analysis.jpg)
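One low-tech way to check is to export your backlinks from whatever tool you use and scan the anchor text for terms you never targeted. A rough sketch - the two-column CSV layout and the expected-anchor list are assumptions to adapt to your own export:

```typescript
import { readFileSync } from "fs";

// Anchors you would legitimately expect (brand name, domain, generic phrases).
const expectedAnchors = ["your brand", "yoursite.com", "click here", "website"];

// Assumes a naive url,anchor export with a header row; quoted commas
// inside fields would need a real CSV parser.
const rows = readFileSync("backlinks.csv", "utf8").trim().split("\n").slice(1);

for (const row of rows) {
  const [url, anchor = ""] = row.split(",");
  const looksExpected = expectedAnchors.some((a) =>
    anchor.toLowerCase().includes(a)
  );
  if (!looksExpected) {
    // Unrecognized anchor text is worth a manual look, and possibly a disavow.
    console.log(`Check: "${anchor}" -> ${url}`);
  }
}
```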
White Hat / Black Hat SEO | | AndyMediaLounge0 -
Best way to handle an SEO error: linking from one site to another on the same IP
We committed an SEO sin and created a site with links back to our primary website. Although it does not matter, the site was not created for that purpose - it is actually a "directory" with categorized links to thousands of culinary sites, and ours are some of the links. This happened back in May 2010.

Starting in April 2011 we saw a large drop in page views, and it dropped again in October 2011. At this point our traffic is down over 40%. Although we don't know for sure whether this has anything to do with it, we know it is best to remove the links.

The question is: given that it's a bad practice, what is the best fix? Should we redirect the second domain to the main one, or just take it down? The second domain does not have much PageRank, and I really don't think there are many, if any, backlinks to it. Will it hurt us more to lose the 1,600 or so backlinks? I would think keeping the links is a bad idea. Thanks for your advice!
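On the mechanics, if you decide to redirect rather than take the directory down, a site-wide permanent redirect is straightforward. A minimal sketch assuming the old domain is served by Node/Express; the domain name is made up:

```typescript
import express from "express";

// Hypothetical primary domain that the old directory site should point to.
const PRIMARY_SITE = "https://primary-culinary-site.example";

const app = express();

// Send every request on the old directory domain to the primary site
// with a 301 (permanent) redirect, preserving the original path.
app.use((req, res) => {
  res.redirect(301, `${PRIMARY_SITE}${req.originalUrl}`);
});

app.listen(8080);
```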
White Hat / Black Hat SEO | | foodsleuth0