Publishing Press Releases after Google Panda 2.5
-
For the past few years I have been publishing press releases on my site for a number of businesses. I have high traffic on my site.
I noticed that with the Google Panda 2.5 update, PRNewswire.com's visibility dropped by 83%.
Should I stay away from publishing press releases now? Does Google consider Press Releases to be "content scraping" since multiple sources are publishing the release?
-
In short, yes.
Don't get me wrong: it is perfectly fine to publish press releases on your website. Google isn't going to lose trust in your website over select content that might be syndicated elsewhere. Give them a little credit. Google certainly recognizes that press releases, however boring and uninspired, make up a necessary part of how companies communicate, and thus a necessary segment of content on the web. Their algorithm, for all of its faults, is not completely without common sense. Moreover, having select content on your website that is duplicated elsewhere isn't necessarily a dealbreaker. Google, particularly with Panda, seems to evaluate the trust of your website as a sitewide factor, and a couple of pieces of duplicate content aren't necessarily going to ruin your day.
But if press releases do make up a significant portion of the content on your website, I would say you're definitely at risk. Google will learn that your website isn't dishing out unique content, and it will trust your website less because of it. Moreover, press releases aren't exactly the world's best type of content, even if they are completely original and don't exist anywhere else on the internet. The fact is that there are much better opportunities to create original, engaging, and shareable content, and taking advantage of those opportunities will help your SEO efforts far more than any number of press releases you could possibly publish.
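If you do republish releases that also run on the wire, one common safeguard is a cross-domain canonical tag on your copy, pointing at whichever version should be treated as the original. A minimal sketch, with a hypothetical placeholder URL:

```html
<!-- In the <head> of the republished copy, pointing back to the original release -->
<link rel="canonical" href="https://www.example.com/press/original-release" />
```

This tells Google which copy to consolidate signals into, rather than leaving it to guess among the syndicated duplicates.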
Related Questions
-
Do industry partner links violate Google's policies?
We're in the process of piecing together a reconsideration request. In doing so, we reached out to an agency to filter and flag our backlinks as safe, should be no-followed, or should be removed. The problem is, they flagged several of our earned, industry-partner links (like those pointing to us, HireAHelper, from 1-800-Pack-Rat and PODS, for example) as either should be no-followed or should be removed. I have a hard time believing Google would penalize such a natural source of earned links, but then again, this is our second attempt at a reconsideration request, and I want to cover all my bases. What say you, Moz community? No-follow? Remove? Leave alone?
White Hat / Black Hat SEO | | DanielH0 -
The wrath of Google's Hummingbird, a big problem, but no quick solution?
One of our websites has been wrongfully tagged for a penalty and has literally disappeared from Google. After lots of research, it seems the reason was a ton of spammy backlinks and irrelevant anchor text. I have disavowed the links, but rankings still haven't rebounded. Any idea how long the wrath of the Google gods will last?
White Hat / Black Hat SEO | | Mouneeb0 -
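For reference, the disavow file mentioned in the question above is a plain-text file uploaded through Google Webmaster Tools: one URL or `domain:` directive per line, with `#` starting a comment. A minimal sketch with hypothetical domains:

```text
# Disavow file - lines starting with # are comments
# Disavow every link from an entire domain:
domain:spammy-link-network.example
# Disavow a single linking page:
https://irrelevant-anchors.example/page-linking-to-us.html
```

Note that disavowed links are discounted the next time Google recrawls them, so recovery typically lags the upload by weeks or months.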
How will Google deal with the crosslinks for my multiple-domain site?
Hi, I can't find any good answer to this question, so I thought, why not ask Moz.com ;-)! I have a site, let's call it webshop.xx, for a few languages/markets: German, Dutch & Belgian, English, French. I use a different TLD with a different IP for each of these languages, so I'll end up with: webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com & webshop.fr. They all link to each other, and every subpage that is translated from the other site gets a link as well from the other languages, so webshop.com/stuff links to webshop.de/stuff. My main website, webshop.com, gets links from every other of these domains, which Open Site Explorer as well as Majestic SEO see as external links (this is happening). My question: how will Google deal in the long run with the crosslinks coming from these domains? Some guesses I made: I get full external-link juice (content is translated, so unique?); I get a bit of the juice of an external link; they are actually seen as internal links; I'll get a penalty. Thanks in advance, guys!!!
White Hat / Black Hat SEO | | pimarketing0 -
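As an aside, the standard way to tell Google that TLDs like the ones in the question above are language/country variants of the same shop is hreflang annotations, which make it far less likely the crosslinks are judged as an external link scheme. A sketch using the question's own domains (the exact region codes chosen here are assumptions):

```html
<!-- On webshop.com/stuff, declaring its translated equivalents -->
<link rel="alternate" hreflang="de" href="https://webshop.de/stuff" />
<link rel="alternate" hreflang="nl" href="https://webshop.nl/stuff" />
<link rel="alternate" hreflang="nl-be" href="https://webshop.be/stuff" />
<link rel="alternate" hreflang="en-gb" href="https://webshop.co.uk/stuff" />
<link rel="alternate" hreflang="fr" href="https://webshop.fr/stuff" />
<link rel="alternate" hreflang="x-default" href="https://webshop.com/stuff" />
```

Each localized page should carry the full set of annotations, including a self-referencing one, for the tags to be honored.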
Google profile
I have a Google profile, https://plus.google.com/u/0/106631271958142100588/, which is assigned to the URL www.propdental.es, but I also write a lot of content for other URLs. My question is whether I should create another profile for the other URLs, which are also mine but not associated with each other. Or can I use the same profile without the risk of losing ranking on the weakest URL, as they all compete for similar keywords? Thanks
White Hat / Black Hat SEO | | maestrosonrisas0 -
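For context, Google authorship in this era was declared with a `rel="author"` link from each page to the writer's Google+ profile, and one profile can legitimately be linked from multiple sites. A minimal sketch using the profile URL from the question above:

```html
<!-- In the <head> of any page whose content this profile authors -->
<link rel="author" href="https://plus.google.com/u/0/106631271958142100588/" />
```

The profile's "Contributor to" section should then list each of the sites in return, so the claim is verified in both directions.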
Google Local Listing Verification - Is there a way to skip this?
Hi, we are running 2 types of services in our company: 1) dry cleaning, and 2) laundry services. The problem is we have 2 websites but only 1 office address. It is not recommended to put the same address on both websites, as both do laundry & dry cleaning services. Is there any tip on how we can get listed on Google Places without using the same address for both websites?
White Hat / Black Hat SEO | | chanel270 -
Someone COPIED my entire site on Google - what should I do?
I purchased a very highly ranked and old site a year or so ago. Now it appears that the people I purchased it from completely copied the site, all graphics and content. They have now built that site up high in the rankings, and I don't want it to compromise my site. These sites look like mirror images of each other. What can I do?
White Hat / Black Hat SEO | | TBKO0 -
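If you need to document the duplication described above (for a DMCA takedown request, for example), a quick way to quantify how similar two pages' text is can be sketched with Python's standard library; the sample strings here are placeholders:

```python
import difflib

def content_similarity(page_a: str, page_b: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two pages' text."""
    return difflib.SequenceMatcher(None, page_a, page_b).ratio()

# A verbatim copy scores 1.0; near-duplicates score close to it.
original = "We purchased a high-ranked site with unique graphics and content."
scraped = "We purchased a high-ranked site with unique graphics and content."
print(content_similarity(original, scraped))  # -> 1.0
```

Running this across matching URL pairs on both domains gives concrete evidence of mirroring, rather than relying on a visual impression.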
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz Community, Our portfolio of around 15 internationalized websites has received a significant, as it seems IP-wide, Google penalty starting November 2010 and has yet to recover from it. We have undergone many measures to lift the penalty, including reconsideration requests without luck, and I am now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgement on what else we can try to lift the penalty. As quick background information: The sites in question offer sports results data and are translated into several languages. Each market (equals language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com. The content is highly targeted around each market, which means there are no duplicate content pages across the domains; all copy is translated, content reprioritized, etc.; however, the core results content in the body of the pages obviously needs to stay about 80% the same. An SEO agency of ours was using semi-automated link-building tools in mid-2010 to acquire link partnerships. There are some promotional one-way links to sports betting and casinos positioned on the page. The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword. All sites have strong domain authority and have been running under the same owner for over 5 years. As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable, given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had been in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.
Our questions are: Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all pages registered in Google Webmaster Tools? What is the most likely cause of our penalty given the background information? Given the drops started already in November 2010, we doubt that the Panda updates had any correlation to this issue. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records etc. Our actions so far were reducing external links, on-page links, and C-class internal links. Are there any other factors/metrics we should look at to help troubleshoot the penalties? After all this time without resolution, should we be moving on to two new domains and forwarding all content via 301s to the new pages? Are there things we need to try first? Any help is greatly appreciated. SEOmoz rocks. /T cxK29.png
White Hat / Black Hat SEO | | tomypro0 -
Hi, I found that one of my competitors has zero backlinks in Google, zero in Yahoo, but about 50,000 in Bing. How is that possible?
Hi, I found that one of my competitors has zero backlinks in Google, zero in Yahoo, but about 50,000 in Bing. How is that possible? I assumed that all search engines would find the backlinks. Besides that, he ranks fairly well, and better than I do, with only a single site and only one article of content, while I have a lot of content and sites. I do not understand why he is ranking better in Google, while Google apparently does not see any of the 50,000 backlinks Bing is finding. Thx, Dan
White Hat / Black Hat SEO | | docschmitti0