How important is fresh content?
-
Let's say the website you are working on has covered most of the important topics on your subject. How important is it to keep adding content when there may not be much left that is relevant to your users? Can a site continue to rank well if nothing new is added to it for years, but it keeps earning good-quality links?
-
Hi Demi
Is the blog on a sub-domain of the main site or on a separate domain?
If it is on a sub-domain, then it counts as fresh content for the main site. If it is not, you might want to think about moving the blog to a sub-domain.
Can you let us know?
Bruce.
-
Content for the pages is really where we are having a problem, because we genuinely have most of the relevant topics covered. Blogging isn't a problem; we can always find new "news" in the industry to write about.
So I guess I am asking more about the page content, not the blog. If we aren't adding new content to a typical page, is that OK?
-
I also agree with Ray-pp & BruceA.
You CAN continue to rank well. However, I wouldn't suggest abandoning blogging about trending topics within the website's industry. There is ALWAYS something that can be explained better or improved upon, or something fresh and new to talk about with your audience.
Question: when you ask about adding content, are you referring to your typical page content or to blogging?
-
Agreed.
Find new, unique angles to keep the site content unique and fresh. For most businesses, seasonal, annual, or quarterly events, either inside the business or outside it, can provide insights and a great reason to add current content to what is already there.
If you are a bit stuck, tell us a bit more about the industry so we can perhaps suggest a few more angles to give you some ideas.
Bruce
-
Hi DemiGR,
Yes, your site can continue to rank well. If the topics genuinely do not warrant updated content, then link building will most likely keep you at the top of the SERPs. However, you're inviting the competition to come in and outrank you somewhat easily.
When you're not producing fresh content, competitors can easily rank by creating authority pages and then adding layers of fresh, related content. I suggest finding relevant topics to write about that point back to your authority pages (the topics that don't change much).
For example, is there a recent news article related to your website's content? Write about that topic and link back to the authority page that explains it in depth. Using current events is an easy way to add new content to your site and keep things relevant.
The 13th of June seemed to have the biggest drop (UK Panda update???) When we copy and paste individual +20 word sentences from within top level content Google does bring up exact results, the content is indexed but the clients site nearly always appears at the bottom of SERP's. Even very new or small, 3-4 page domains that have clearly all copied all of their content are out ranking the original content on the clients site. As I'm sure know, this is very annoying for the client! And this even happens when Google’s cache date (that appears next to the results) for the clients content is clearly older then the other results! The only major activity was the client utilising Google optimiser which redirects traffic to various test pages. These tests finished in June. Details about the clients website: Domain has been around for 4+ years The website doesn't have a huge amount of content, around 40 pages. I would consider 50% original, 20% thin and 30% duplicate (working on fixing this) There haven’t been any signicant sitewide or page changes. Webmaster tools show nothing abnormal or any errors messages (some duplicate meta/title tags that are being fixed) All the pages of the site are indexed by Google Domain/page authority is above average for the niche (around 45 in for the domain in OSE) There are no ads of any kind on the site There are no special scripts or anything fancy that could cause problems I can't seem to figure it out, I know the site can be improved but such a severe drop where even very weak domains are out ranking suggests a penalty of some sort? Can anyone help me out here? hxuSn.jpg0