How many times should one submit the same article to various websites? 1 time? 10 times? What is okay to do with the most recent Panda update?
-
For link-building purposes, it seemed acceptable in the past to post the same article to multiple sites for links. After the most recent Panda update, however, our thinking is that this may no longer be a good practice. So the question is: how many times is it okay to submit an article for link-building purposes? Should you always submit to only one site? Is it okay to submit to more than one? What is the right way to submit for link building in Google's eyes?
Thanks
-
What about posting a short mention and link to your new article on several user forums that are related to your niche? The clicks are generally pretty good, so it's a win on that score, but I'm curious about the SEO aspects.
This form of article promotion can be quite productive, and it is a form of link building.
It is most effective when the forum links point to the article on your own site. I would not advise publishing an article on a third-party site just to build a link to your site, and then linking to that third-party article from forums. You can take that approach, and it does have benefits, but your efforts and the link juice are diluted.
-
When we distribute press releases to news channels, will Google see this as duplicate content, so that we only get link juice from one instance of that press release? Is that right?
If you are asking for a definitive answer, only Google knows for sure.
In my experience, it depends entirely on the publishing site: specifically, how exactly they present the press release.
Example 1 - A full-page press release is presented as its own web page. If multiple sites present the same content in the same way, Google will likely index only one, though a second or third could be indexed depending on circumstances.
Example 2 - The same press release is offered with open commenting allowed. Each comment adds uniqueness to the page and improves the chance that it will be indexed.
Example 3 - A short press release is included on a page with other content. If the other content is unique, the page can be indexed.
Example 4 - An excerpt from the press release is shared on a page with other unique content. The page will likely be indexed.
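Google's duplicate-detection methods are not public, but the intuition behind the four examples above can be illustrated with a toy shingle-overlap measure: two pages sharing most of their word sequences look like duplicates, and any added unique text (comments, surrounding content) lowers the overlap. This is a simplified sketch with made-up text and an arbitrary shingle size, not Google's actual algorithm:

```python
import re

def shingles(text, k=5):
    """Split text into overlapping k-word shingles (lowercased, punctuation stripped)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity between the shingle sets of two texts (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

release = "Acme Corp today announced the launch of its new widget line for 2024."
verbatim_copy = release  # Example 1: the same release republished as-is
with_comments = release + (
    " Great news! I have been waiting for this. Will it ship to Europe?"
)  # Example 2: the release plus user comments

print(similarity(release, verbatim_copy))  # identical pages score 1.0
print(similarity(release, with_comments))  # added comments lower the overlap
```

Pages whose score sits near 1.0 are the ones a search engine would most plausibly collapse into a single indexed copy.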
-
How many times should one submit the same article to various websites?
Once.
If you submit the same article to multiple sites, you are just spreading duplicate content. Ideally, most of your content is published on your own site. The best reasons to publish content on another site are to reach audiences you otherwise would not, and to build links.
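When the same article does have to appear on more than one site, one common remedy is asking the syndicating site to add a cross-domain rel=canonical link pointing back at the original, so the duplicate copies consolidate rather than compete. As a rough, stdlib-only sketch (the URLs are placeholders), here is one way to check whether a page declares a canonical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the <link rel="canonical"> tag, if the page has one."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

def find_canonical(html):
    """Return the canonical URL declared in an HTML document, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

page = (
    '<html><head>'
    '<link rel="canonical" href="https://example.com/original-article">'
    '</head><body>Syndicated copy of the article.</body></html>'
)
print(find_canonical(page))  # https://example.com/original-article
```

Running a check like this against each syndication partner is a quick way to see which copies actually credit your original.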