What if a site has links from news sites carrying the same or similar content, like a press release? Is that OK?
-
Thanks in advance!
-
I like what David and Samuel have to say here. There is, and always will be, room for press releases when it comes to spreading PR and information. Businesses will continue to put releases out there and often link back to themselves, but they should be sure to link with brand terms, generic words ("find out more", etc.), or URLs, similar to the way people generally build links now. You wouldn't put out a press release linking back to yourself with "car insurance" as the anchor text and spread it to 150 different sources, not if you knew what was good for you.
News stories and releases are always going to get picked up and spread, but what Google is looking for when it comes to actually hurting sites with links like this is the lack of a natural pattern. Does the site receive next to no media attention but suddenly have 500 links from an identical piece of copy, whilst also receiving no new social media attention and no additional coverage (e.g. no one has taken the press release and written their own story about it or conducted an interview with a company representative)? That pattern is unnatural and warrants further investigation.
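If you want a rough way to spot that kind of pattern in your own profile, the sketch below counts link velocity and anchor-text concentration from a backlink export. It's only an illustration: the "date_found" and "anchor_text" column names are hypothetical, so adjust them to whatever your backlink tool actually exports.

```python
import csv
from collections import Counter

# Hypothetical columns: "date_found" (YYYY-MM-DD) and "anchor_text".
# Rename these to match the headers in your actual export file.
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Bucket links by the month they were first seen (YYYY-MM prefix).
links_per_month = Counter(row["date_found"][:7] for row in rows)
anchors = Counter(row["anchor_text"].strip().lower() for row in rows)

# A sudden spike in one month, paired with one anchor dominating the
# profile, is the kind of unnatural pattern described above.
print("Links found per month:", links_per_month.most_common(5))
top_anchor, count = anchors.most_common(1)[0]
print(f"Top anchor: {top_anchor!r} = {count / len(rows):.0%} of all links")
```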
Is the company regularly being cited, mentioned and written about? Does it put out a release about a real new product or development and have that release picked up by real news sources, some of whom put their own thoughts on the web about the company development? This is natural-looking.
I hope this makes some sense. Essentially, the goal is to spread information the way you would if Google were not an issue, with the resulting coverage being beneficial to your SEO efforts nonetheless. Putting out press releases about nothing and expecting links back from newswires, etc. isn't a brilliant idea, but using press releases for PR can be very beneficial for SEO when done properly.
-
Press releases are a commonly misunderstood concept in SEO.
The main purpose of a press release should not be to gain links, at least not directly. (Before anyone cries blasphemy, hear me out.)
"Remember: As I wrote on Moz, press releases and related items should be used to get coverage, not links." Samuel, completely agree.
A press release should be used to promote useful or "newsworthy" content about your business. By having your news or PR listed on a site, you are promoting information, not trying to gain links. Press releases net you additional traffic because people are interested in what you have to say, not simply because there is a link on a major site. We wrote a bit about this on our blog: http://www.webdesignandcompany.com/10-old-seo-strategies-you-should-stop-using
"Creating press releases just for the sake of links alone is not a good practice, and can become expensive if done regularly. A good press release is one that can help other people. Remember, humans are social creatures, and when we find something that helps us, we share it. By having people share your information and providing something that is truly newsworthy and good, this can help your media relations be more effective, and believe me, the links will follow. Think long term, not quick fix. Zach Cutler wrote a good article for The Huffington Post about 8 Tips for Writing a Great Press Release."
My second question is: why would there be a concern about duplicate content? If you create a press release, that info should not directly match any content on your site. If it does, more than likely it is not "newsworthy" content but something that was created from your site content to gain backlinks or additional rank. If you have content somewhere on your site that directly matches the info in your press release, then I would revise it so that the subject matter and keyword focus stay the same but the content is varied.
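If you want a quick, tool-agnostic way to check how closely a release matches an existing page, a minimal sketch comparing word shingles might look like this. The file names are placeholders; point them at your own release and page copy.

```python
def shingles(text: str, size: int = 5) -> set:
    """Overlapping word n-grams of a text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a: str, b: str) -> float:
    """Similarity of two texts' shingle sets: 0 = distinct, 1 = identical."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa or sb) else 0.0

# Placeholder file names: the press release text and a page from your site.
with open("press_release.txt", encoding="utf-8") as f:
    release = f.read()
with open("site_page.txt", encoding="utf-8") as f:
    page = f.read()

# A high overlap suggests the release is a near-copy of existing site
# content and should be rewritten, per the advice above.
print(f"Overlap: {jaccard(release, page):.0%}")
```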
-
It depends on what you mean by "OK." Remember: As I wrote on Moz, press releases and related items should be used to get coverage, not links. The coverage is what then indirectly "earns" links from quality, authoritative outlets. If you publish releases on countless press-release sites (especially with exact-match anchor text!), then you are just "building" artificial links that can quickly get you into trouble if you have too many of them or too much of the same anchor text.
A few resources: Matt Cutts, the head of Google's web-spam team, says that press release links will not (directly) help your rankings. Search Engine Watch has five tips on using press releases for links.
Hope that helps!