Is it safe to publish 3 paid press releases in a single day?
-
Is it a safe bet to publish 3 PAID press releases (on PRleap.com) on the same website on the same day, each having about 10 links to different pages of the same website?
I mean... will search engines spot something fishy going on there?
-
Thanks for the awesome suggestions Andrew.
The reason I'm asking is that my subscription with them ends on the 10th. So either I spend a couple of hundred dollars to extend that subscription for another month (which I don't want to do), or I post my 3 PRs in the next two days to use up my quota for this month.
Now that you tell me it's safe, I'll go ahead and post them. All three were written by me, and they are fresh, genuine and meaningful. So, backed by your suggestion and experience, I shouldn't worry at all, and I'll shoot them out tomorrow or the day after.
thanks so much,
KS__
-
Hi Ks__,
As long as they are genuine, well-written press releases, you have nothing to worry about. In fact, even if you used the same PR content across multiple PR submission sites, you wouldn't get into any trouble as such - the search engines would just ignore the duplicate content and not pass any value to your website.
You can imagine how some sites which appear on the front page of Digg/StumbleUpon can potentially receive tens of thousands of links overnight - and they rank fine.
That said, does a single PR really need over ten different links? You might want to consider condensing that number to provide a more concise call to action for your readers, and spreading the PR submissions out over a few days - don't give away everything all at once!
Hope that helps.
Cheers,
Andrew
Related Questions
-
Keywords in URL: sub-directory or single layer keywords?
Hi guys, I'm putting together a proposal for a new site and trying to figure out which would be better: (A) have the keyword split across multiple directories, or (B) duplicate part of the keyword so the full phrase appears hyphenated. For example, for the topic of "Christmas decor", would you use: (A) www.domain.com/Christmas/Decor (B) www.domain.com/Christmas/Christmas-Decor In example B the word 'Christmas' is duplicated, which looks a little spammy, but the key term "Christmas decor" is in the URL without being broken up by directories. Which is stronger? Any advice welcome! Thanks guys!
Intermediate & Advanced SEO | JAR8971
2.3 million 404s in GWT - learn to live with 'em?
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this. Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since started to grow progressively higher; now we have 2.3 million 404s in GWT. Based on what I’ve been able to determine, links on this particular site relative to the data feed are broken generally due to one of two reasons: the page just doesn’t exist anymore (i.e. wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (page still exists, just now under a different link). With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file. Based on what I’ve read here and here, 404s in and of themselves don’t really hurt the site indexation or ranking. And the more I consider it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time due to product pages coming and going. 
Bottom line, it looks like if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we'll just have to put up with broken links on the site on a more regular basis. So here's where my thought process is leading:
- Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index.
- Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis.
- Watch the overall trend of 404s in GWT. At least make sure they don't increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates. Thoughts? If you think I'm off base, please set me straight. 🙂
Intermediate & Advanced SEO | ufmedia0
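A side note on the htaccess-bloat concern raised above: Apache can serve bulk 301s from an external lookup table via RewriteMap, rather than millions of individual rules. A minimal sketch, assuming Apache with mod_rewrite, access to the server/vhost config (RewriteMap is not permitted in .htaccess), and a hypothetical map file path:

```apache
# Server/vhost config — RewriteMap is not allowed in .htaccess.
# redirects.map is a hypothetical lookup file, one "old-path new-url" pair per line;
# for millions of entries, convert it with httxt2dbm and use "dbm:" instead of "txt:".
RewriteEngine On
RewriteMap redirects "txt:/etc/apache2/redirects.map"

# 301 only when the requested URI appears in the map; otherwise fall through to normal handling.
RewriteCond ${redirects:%{REQUEST_URI}|NOT_FOUND} !NOT_FOUND
RewriteRule .* ${redirects:%{REQUEST_URI}} [R=301,L]
```

The dbm variant keeps lookups fast regardless of map size, which is the point for a site at this scale.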
Is 301 redirecting your index page to the root '/' safe to do or do you end up in an endless loop?
Hi, I need to tidy up my home page a little. I have some links to our index.html page, but I just want them to go to the root '/', so I thought I could 301 redirect it. However, is this safe to do? I'm getting duplicate page notifications in my analytics reporting tools about the home page and need a quick way to fix this issue. Many thanks in advance, David
Intermediate & Advanced SEO | David-E-Carey0
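For what it's worth, the usual way to do this redirect without a loop is to match against THE_REQUEST, which reflects only the original client request line, so Apache's internal mapping of "/" to index.html never re-triggers the rule. A minimal sketch, assuming Apache with mod_rewrite:

```apache
# .htaccess — 301 /index.html to the root without a redirect loop.
# %{THE_REQUEST} holds the raw client request line (e.g. "GET /index.html HTTP/1.1"),
# so it is not changed when Apache internally serves index.html for "/",
# which is what prevents the endless loop.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/index\.html[\s?]
RewriteRule ^index\.html$ / [R=301,L]
```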
Easiest way to disavow single links on an on-going basis?
We frequently get random, super-sketchy-looking blogs linking to us with no author or contact information. I believe we are being targeted by a competitor setting up garbage links to us. I am hoping to use the Google disavow links tool to deal with this, but is it: Safe to use, or does using it flag us as link spammers? Possible to use on an ongoing basis for single links (as they come in, as opposed to a bunch of backlogged links)? Thanks!
Intermediate & Advanced SEO | BlueLinkERP0
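For reference, the disavow file itself is just a plain-text list uploaded through Google's disavow tool, so it can be appended to and re-uploaded as new links come in (each upload replaces the previous file for that property). A minimal example, with made-up domains:

```text
# Lines starting with "#" are comments.
# Disavow every link from an entire host:
domain:sketchy-seo-blog.example
# Disavow a single URL:
http://random-spam-site.example/some-post.html
```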
Releasing Multiple Language Blog Articles ?
I was hoping anyone could give me some advice on my situation. Our blog is a huge traffic source for us; we frequently release fresh blog articles on our English-language website, bringing lots of relevant traffic for a variety of different relevant topics. Some of these articles would be very useful and relevant for visitors to our German website, so I would like to get them translated and posted on our separate German-language blog on our separate German website. The article text will not change much, as the information is the same for Germany also. How should I go about this without running into duplicate content issues with Google? I looked into rel=alternate and realized that I cannot use this over two separate websites; I also thought about rel=canonical, but it doesn't look like this would be suitable either. Can anybody please give me any advice or thoughts on this?
Intermediate & Advanced SEO | Antony_Towle0
Post your 3 best ways to rank well on Google
Hi, anyone care to share your 3 best ways to rank well on Google? As for me, I think: 1) Link building & social media 2) Onsite optimization 3) Quality content What about you?
Intermediate & Advanced SEO | chanel270
Penguin or paid link penalty, or both?
Hello, I have a site, macpokeronline.com, that has seen a dramatic decrease in visitors in the last few months; it has gone down from 800 per day to 200 per day. It is a pretty complex situation. The site owner purchased paid links from reputable Mac sites for years (they were more like followed advertisements, but were only there for SEO purposes). Now that I'm going through the link profile in OSE, I can see that a majority of their links come from these sites. There is also a branding issue: there are almost 15,000 links with the anchor text of "macpokeronline.com". These are obviously branded links, and I don't know the best way to deal with them (though the majority are coming from the paid link sites). We have just sent the request in to remove the paid links from the sites, and I'm guessing since he is paying over $1000 a month for the links, they will be removed quickly. The site has been receiving significantly less traffic since Penguin (Apr 24-25). We received a message on July 19th which was the generic unnatural link warning, saying that once we remove links we should make a reconsideration request. Then on July 23rd, we received another message that says they are taking a "very targeted action on the unnatural links instead of your site as a whole", which I have never seen before. This damage was done before I was hired by this client; I just want to get his traffic back up so I can help him even further, and I want to know more about the steps I should take. 1. I will definitely remove the paid ads. What else should I do? Thanks, Zach
Intermediate & Advanced SEO | BestOdds0
Consolidating 3 regional domains
We recently took the decision to consolidate 3 domains: .com.au, .eu and .us. This decision was made before I arrived here, and I'm not sure it's the right call. The proposal is to use a brand new .co domain (no, .com isn't available). The main reason is to try to build domain strength towards one domain instead of trying to grow 3 domains. We re-sell stock similar to hotel rooms (different industry) and our site is heavily search-based, so duplicate content is an issue that we hope to improve on with this approach. One driver was that we found, for example, that our Australian site was outranking our European site in European searches. We don't want to only hold certain inventory on certain sites either, because this doesn't work with our business rules. Anyway, if we are to go about this, what would be best practice? Should we suddenly just close one of the domains and do a * 301 redirect, or should we redirect each page individually? Someone has proposed using robots.txt for a phased approach, but to my knowledge this isn't possible with robots.txt, though a phased individual-page 301 using htaccess may be possible? In terms of SEO, is 1 domain generally better than 3? Is this a good strategy? What's the best 301 approach? Any other advice? Thanks, J
Intermediate & Advanced SEO | Solas0