Do onsite content updates have an effect on SERPs?
-
Hi,
Some might see this as a very (VERY) basic question, but I wanted to drill down into it anyway.
Onsite content: let's say you have a service website with a blog attached, and the blog gets updated every other day with 500 words of relevant content containing anchor-text links back to a relevant page on the main website.
Setting aside social signals and natural links built from the quality content, will adding the content with anchor-text links be more beneficial than using that content to generate links through guest blogging?
10 relevant articles onsite with anchor links, or 10 guest posts on other websites?
I guess some might say 5 onsite and 5 guest posts.
-
What I like to do is occasionally research, work on, and perfect a solution to a problem.
Then I have plenty of material for both internal and external content. You can first write a blog post or article on your own site, then edit it a bit until it looks good.
Then show it as an example to other webmasters and podcast hosts.
They get an idea of your take, can decide whether they want your content on the matter on their site, and then there is a natural page to link to for any story they do on you.
It seems like a good way to go, and you then have other strong pages with links to you that are on-topic, relevant, and different content from the first article.
The second, third, or fourth article gets easier and easier to write, and doing a podcast is a walk in the park after all that research and reporting.
-
Thanks Nakul,
I do use guest blogging as my main way of building SERP rankings, as I've had some great success with it in the past and still do currently.
But I think it's time to start mixing it up with social signals, directory submissions, etc. I don't expect this kind of link building to do much; it might just open my link profile up a bit, and fingers crossed the diversity alone helps the SERPs.
-
Depending on your link profile and how you currently rank in the SERPs, your best bet is to do both. You are right in thinking about it that way: both internal links and external links are important, and some strong guest posts from niche blogs would certainly help. I would also not overdo just the guest-post kind of link; do other kinds of link building as well. For example, provide a "link to this article" code on your blog, which might lead to readers and other webmasters linking to you from their blogs or discussing your article on a niche forum.
I hope that helps.
Related Questions
-
Frequent Blog Content - Effective in Improving Ranking?
To what extent will posting quality blog posts 2 to 3 times per week have the following effects:
1. Improve ranking for specific keywords?
2. Create backlinks to our website
3. Increase Moz domain authority
4. Increase organic search traffic
Assume that the blog posts are geared towards answering user inquiries and are also posted on our social media accounts. Would such an approach be better than engaging in a link building campaign, in the sense that the links will be created organically by users that want to link to our site? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
Change of content
Hello, when you make a major change to the content on a page, I know it takes time to start seeing results in terms of ranking. Let's say I make a change today, expecting to see the first results of that change 2 months from now. Then, a month later, I add some content and again make some minor changes. Do I have to wait another 2 months from the date of my second round of changes to see results, or will I see the results of the first change as originally planned, 2 months after my major content change? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Does Google see this as duplicate content?
I'm working on a site that has too many pages in Google's index, as shown by a simple count via a site search (example): site:http://www.mozquestionexample.com. I ended up getting a full list of these pages, and it shows pages that have supposedly been excluded from the index via GWT URL parameters and/or canonicalization. For instance, the list of indexed pages shows:
1. http://www.mozquestionexample.com/cool-stuff
2. http://www.mozquestionexample.com/cool-stuff?page=2
3. http://www.mozquestionexample.com?page=3
4. http://www.mozquestionexample.com?mq_source=q-and-a
5. http://www.mozquestionexample.com?type=productss&sort=1date
Example #1 above is the one true page for search and the one that all the canonicals reference. Examples #2 and #3 shouldn't be in the index because the canonical points to URL #1. Example #4 shouldn't be in the index because it's just a source code that, again, doesn't change the page, and the canonical points to #1. Example #5 shouldn't be in the index because it's excluded in parameters as not affecting page content and the canonical is in place. Should I worry about these multiple URLs for the same page, and if so, what should I do about it? Thanks... Darcy
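A quick way to sanity-check which of those indexed URLs collapse onto one canonical page is to strip the query strings and group what remains. A minimal sketch (the URL list is hypothetical, modeled on examples #1, #2, and #4 above):

```python
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit

# Hypothetical list of indexed URLs, modeled on examples #1, #2,
# and #4 in the question above.
indexed = [
    "http://www.mozquestionexample.com/cool-stuff",
    "http://www.mozquestionexample.com/cool-stuff?page=2",
    "http://www.mozquestionexample.com/cool-stuff?mq_source=q-and-a",
]

def canonical_form(url):
    """Drop the query string and fragment, keeping scheme, host, and path."""
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))

# Group every indexed URL under the page it should canonicalize to.
groups = defaultdict(list)
for url in indexed:
    groups[canonical_form(url)].append(url)

for page, variants in sorted(groups.items()):
    print(page, "<-", len(variants), "indexed variants")
```

Here all three variants collapse to http://www.mozquestionexample.com/cool-stuff, matching the expectation in the question that only example #1 should remain in the index.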
Intermediate & Advanced SEO | 94501
-
Duplicate content on subdomains
Hi All, The structure of the main website goes by http://abc.com/state/city/publication - We have a partnership with public libraries to give local users access to the publication content for free. We have over 100 subdomains (each for a specific library) that have duplicate content issues with the root domain. Most subdomains have very high page authority (the main public library and other local .gov websites have links to these subdomains). Currently these subdomains are not indexed because the robots.txt file excludes bots from crawling. I am in the process of setting canonical tags on each subdomain and opening up the robots.txt file. Should I set the canonical tag on each subdomain (homepage) to the root domain version or to the specific city within the root domain?
Example 1:
Option 1: http://covina.abc.com/ = Canonical Tag = http://abc.com/us/california/covina/
Option 2: http://covina.abc.com/ = Canonical Tag = http://abc.com/
Example 2:
Option 1: http://galveston.abc.com/ = Canonical Tag = http://abc.com/us/texas/galveston/
Option 2: http://galveston.abc.com = Canonical Tag = http://abc.com/
Example 3:
Option 1: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/us/kansas/hutchinson/
Option 2: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/
I believe it makes more sense to set the canonical tag to the corresponding city (option 1), but I'm wondering if setting the canonical tag to the root domain will pass "some link juice" to the root domain and be more beneficial. Thanks!
Intermediate & Advanced SEO | NewspaperArchive
-
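For concreteness, option 1 on the first example amounts to a single `link rel="canonical"` element in the `<head>` of the subdomain's homepage, along these lines:

```html
<!-- In the <head> of http://covina.abc.com/ (option 1, first example) -->
<link rel="canonical" href="http://abc.com/us/california/covina/" />
```

With option 2, every subdomain would instead point at http://abc.com/, telling Google all 100+ library homepages are the same page, which is a much stronger (and riskier) claim.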
How to Fix Duplicate Page Content?
Our latest SEOmoz crawl reports 1138 instances of "duplicate page content." I have long been aware that our duplicate page content is likely a major reason Google has devalued our Web store. Our duplicate page content is the result of the following:
1. We sell audio books and use the publisher's description (narrative) of the title. Google is likely recognizing the publisher as the owner / author of the description and our description as duplicate content.
2. Many audio book titles are published in more than one format (abridged, unabridged CD, and/or unabridged MP3) by the same publisher, so the basic description at our Web store would be the same for each format = more duplicate content at our Web store.
Here are two examples (one abridged, one unabridged) of one title at our Web store: Kill Shot - abridged; Kill Shot - unabridged. How much would the body content of one of the above pages have to change so that a SEOmoz crawl does NOT say the content is duplicate?
Intermediate & Advanced SEO | lbohen
-
How should I exclude content?
I have category pages on an e-commerce site that are showing up as duplicate pages. At the top of each page are Register and Login links, and when selected they come up as category/login and category/register. I have 3 options to attempt to fix this and was wondering which you think is best:
1. Use robots.txt to exclude them. There are hundreds of categories, so it could become large.
2. Use canonical tags.
3. Force Login and Register to go to their own pages.
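On option 1's size concern: Google honors the `*` wildcard in robots.txt paths, so the hundreds of category variants could be covered by two rules rather than one per category. A sketch (the exact paths are hypothetical):

```text
# Hypothetical robots.txt sketch for option 1: block the
# login/register variants that appear under every category
User-agent: *
Disallow: /*/login
Disallow: /*/register
```

Note that robots.txt only blocks crawling, not indexing of already-known URLs, which is one argument for canonical tags (option 2) instead.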
Intermediate & Advanced SEO | EcommerceSite
-
Internal structure update
How often does Google update its view of the internal linking structure of a website? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Panda Update - Challenge!
I met with a new client last week. They were very negatively impacted by the Panda update. Initially I thought the reason was pretty straightforward and had to do with duplicate content. After my meeting with the developer, I'm stumped, and I'd appreciate any ideas. Here are a few details to give you some background. The site is a very nice-looking (2.0) website with good content. Basically they sell fonts. That's why I thought there could be some duplicate content issues. The developer assured me that the product detail pages are unique and that he has the rel=canonical tag properly in place. I don't see any issues with the code, the content is good (not shallow), there's no advertising on the site, the XML sitemap is up to date, and Google Webmaster Tools indicates that the site is getting crawled with no issues. The only thing I can come up with is that it is either:
1. Something off-page related to links, or
2. Related to the font descriptions - maybe they are getting copied and pasted from other sites, and they don't look like unique content to Google.
If anyone has ideas or would like more info to help, please send me a message. I greatly appreciate any feedback. Thank you, friends! LHC
Intermediate & Advanced SEO | lhc67