Do onsite content updates have an effect on SERPs?
-
Hi,
Some might see this as a very (VERY) basic question, but I wanted to drill down into it anyway.
Onsite content: let's say you have a service website with a blog attached, and the blog gets updated every other day with 500 words of relevant content containing anchor text links back to a relevant page on the main website.
Setting aside the social signals and natural links that quality content attracts, will adding that content with anchor text links be more beneficial than using the same content to generate links through guest blogging?
In other words: 10 relevant articles onsite with anchor links, or 10 guest posts on other websites?
I guess some might say 5 onsite and 5 guest posts.
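To be concrete about what I mean by "anchor text links back to a relevant page", here's a minimal sketch of an in-content link from a blog post to a service page (the domain, path, anchor text, and copy are all invented for illustration):

```html
<!-- Inside a blog post's body; URL and anchor text are hypothetical -->
<p>
  Blocked gutters cause most winter damp problems. A professional
  <a href="https://www.example.com/services/gutter-cleaning">gutter cleaning service</a>
  can usually clear them in an afternoon.
</p>
```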
-
What I like to do is occasionally research, work on, and perfect a solution to a problem.
Then I have plenty of material for both internal and external content. You can first write a blog post or article on your own site, then edit it a bit until it's looking good.
Then show it as an example to other webmasters and podcast hosts.
They get an idea of how you think, can decide whether they want your content on that subject on their site, and there is a natural page to link to for any story they do on you.
It seems like a good way to go, and you end up with other strong pages linking to you that are on-topic, relevant, and contain different content than the first article.
The second, third, or fourth article gets easier and easier to write, and doing a podcast is a walk in the park after all that research and reporting.
-
Thanks nakul,
I do use guest blogging as my main way of building SERPs, as I've had great success with it in the past and currently.
But I think it's time to start mixing it up with social signals, directory submissions, etc. I don't expect that kind of link building to do much on its own, but it might open my link profile up a bit, and fingers crossed the diversity alone helps the SERPs.
-
Depending on your link profile and how you currently rank in the SERPs, your best bet is to do both. You are right to think about it that way: both internal links and external links are important, and some strong guest posts from niche blogs would certainly help. I would also not overdo guest-post links alone; do other kinds of link building as well. For example, provide a link or embed code for the article on your blog, which might encourage readers and other webmasters to link to you from their own blogs or discuss your article on a niche forum.
I hope that helps.
Related Questions
-
Volatile SERPS?
I have a new client whose rankings are jumping around a lot. I took over the account seven weeks ago, but they'd been doing SEO with a decent provider. They switched to me because they thought he wasn't working very hard, and he's not very nice. I took them on in December, when they started out at #18. On Dec 24 they were at 9, Jan 1 at 12, Jan 2 at 14, Jan 3 at 24, Jan 12 at 19, Jan 15 at 28, Jan 16 at 25, Jan 19 at 28, and today at 24. Some really bad black-hat-ish providers have bounced up to the top. Any idea what's going on?
Intermediate & Advanced SEO | julie-getonthemap0 -
Search Causing Duplicate Content
I use OpenCart and have found that a lot of my duplicate content (mainly from product pages) is caused by the search function. Is there a simple way to tell Google to ignore the search function pathway? Or is that not recommended? Here are two examples: http://thespacecollective.com/index.php?route=product/search&tag=cloth http://thespacecollective.com/index.php?route=product/search
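One common way to tell Google to ignore the search pathway (a sketch, not a recommendation specific to this site) is a robots.txt rule matching the `route=product/search` parameter visible in the example URLs. Note that this only blocks crawling; it won't by itself remove URLs that are already indexed:

```text
# robots.txt at the domain root (sketch; verify the pattern against your own URLs)
User-agent: *
# Matches e.g. /index.php?route=product/search&tag=cloth by prefix
Disallow: /index.php?route=product/search
```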
Intermediate & Advanced SEO | moon-boots0 -
Duplicate Content with URL Parameters
Moz is picking up a large quantity of duplicate content, consisting mainly of URL parameters like ,pricehigh and ,pricelow (used for page sorting). Google has indexed a large number of the pages (not sure how many), and I'm not sure how many of them are ranking for search terms we need. I have added the parameters in Google Webmaster Tools and set them to 'Let Google decide', but Google still sees the pages as duplicate content. Is this a problem that we need to address? Or could trying to fix it do more harm than good? Has anyone had any experience with this? Thanks
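Alongside the Webmaster Tools parameter settings, a common belt-and-braces approach (a sketch with an invented URL) is a rel=canonical tag on every sorted variant pointing at the unsorted page, so the ,pricehigh and ,pricelow versions consolidate onto one URL:

```html
<!-- In the <head> of /category,pricehigh and /category,pricelow (hypothetical paths) -->
<link rel="canonical" href="https://www.example.com/category" />
```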
Intermediate & Advanced SEO | seoman100 -
On Page Content. has a H2 Tag but should I also use H3 tags for the sub headings within this body of content
Hi Mozzers, My on-page content sits under my H2 tag. I have a few subheadings within the content to help break it up, and currently these are just underlined (not bold or anything). From an SEO perspective, should I be making these subheadings H3 tags? Otherwise, I just have 500-750 words of content under a single H2 tag, which is what I am currently doing on my landing pages. Thanks, pete
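The heading hierarchy being described would look something like this (headings and copy invented for illustration):

```html
<h1>Landing Page Title</h1>
<h2>Main content heading</h2>
<p>Opening section of the 500-750 words of content…</p>
<h3>First subheading (instead of underlined text)</h3>
<p>…</p>
<h3>Second subheading</h3>
<p>…</p>
```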
Intermediate & Advanced SEO | PeteC120 -
How to deal with URLs and tabbed content
Hi All, We're currently redesigning a website for a new-home developer and trying to figure out the best way to handle tabbed content in the URL structure. The current design has a page for each development, within which you can select your house type; the house type page then displays tabs for things like the plot map, availability and pricing, specifications, etc. Our development team is looking at handling this by ending the URL with a hash fragment or a query string, so we can still land users on specific tabs (for PPC, for example). My question is really: has anyone had any experience with this? Any recommendations on how best to structure the URLs for SEO? Thanks
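For reference, the two URL styles under consideration behave differently (paths invented): a query string gives each tab its own crawlable URL, which usually wants a canonical back to the default tab, while a hash fragment is ignored by Google, so all tabs share one indexed URL:

```html
<!-- Query-string tabs: /house-type?tab=availability is a distinct crawlable URL,
     so point the variants back at the default page -->
<link rel="canonical" href="https://www.example.com/development/house-type" />

<!-- Fragment tabs: /house-type#availability — the #fragment is not indexed
     separately, so only one URL exists and no canonical is needed -->
```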
Intermediate & Advanced SEO | J_Sinclair0 -
Duplicate Content On A Subdomain
Hi, We have a client who is close to completing a site aimed specifically at the UK market (they're doing this in-house, so we've had no say in how it will work). The site will be almost a duplicate (in terms of content, targeted keywords, etc.) of a section of the main site, which sits on the root domain and is targeted toward the US. The only differences will be certain spellings and the currency. If this new UK site sits on a subdomain of the main site, which is a .com, will this cause duplicate content issues? I know there wouldn't be an issue if the new site were on a separate .co.uk domain (according to Matt Cutts), but it looks like the client wants it on a subdomain. Any help/advice would be greatly appreciated.
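If the UK version does end up on a subdomain, the standard way to tell Google the two pages are regional alternates rather than duplicates is hreflang annotations on both versions (a sketch; the `uk.` subdomain and paths are assumptions):

```html
<!-- Placed in the <head> of BOTH the US and UK versions of the page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/page/" />
<link rel="alternate" hreflang="en-gb" href="https://uk.example.com/page/" />
```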
Intermediate & Advanced SEO | jasarrow0 -
How to manage duplicate content?
I have a real estate site that contains a large amount of duplicate content. The site contains listings that appear both on my client's website and on my competitors' websites (which have better domain authority). It is critical that the content stays there, because buyers need to be able to find these listings to make enquiries. The result is that I have a large number of pages that contain duplicate content in some way, shape or form. My search results pages are really the most important ones, because these are the ones targeting my keywords. I can differentiate these to some degree, but the listings themselves are duplicates. What strategies exist to ensure I'm not suffering as a result of this content? Should I: 1) Make the duplicate content noindex? Yes, my results pages will have some degree of duplicate content, but each result only displays a 200-character summary of the advert text, so I'm not sure if that counts. Would reducing the amount of visible duplicate content improve my rankings as a whole? 2) Link back to the client's site to indicate that they are the original source? Any suggestions?
Intermediate & Advanced SEO | Mulith0 -
Why is noindex more effective than robots.txt?
In this post, http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo, it mentions that the noindex tag is more effective than using robots.txt for keeping URLs out of the index. Why is this?
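The short version of the argument: robots.txt only blocks crawling, and a blocked URL can still end up in the index if other sites link to it, whereas a meta noindex tag (which requires the page to remain crawlable) actually removes the page from the index. As a sketch:

```html
<!-- In the <head> of the page you want kept out of the index;
     "follow" still lets crawlers pass link equity through the page -->
<meta name="robots" content="noindex, follow" />
```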
Intermediate & Advanced SEO | nicole.healthline0