Fresh Content: Still As Important?
-
We have an internal debate that perhaps y'all can help us resolve.
In the past, "freshness" of content has been important, correct? (Google's QDF, for example.) When we build a site with the intent to SEO it, we build the core pages with the expectation that we will add more pages as the project progresses, thus satisfying the "fresh content" factor.
But a client has proposed that we completely build the site out with all the pages we hope to rank, getting the upfront bang for our buck. The expectation is that traffic soars right off.
Now the client says that he has been doing this for years and has not been affected by any algorithm changes (although we have not seen proof of this from him).
So our question is this: Is it better to launch a website full of fresh content at the beginning of the project, for a jumpstart on traffic, then leave the site alone (for the most part)?
or
Is it better to start with core pages of fresh content and build out new pages from there, so the website stays fresh every month?
And can you prove your argument? (We need cold, hard facts to be convinced.)
-
EGOL, a big-time member on these forums, posted years ago that there will be a day when the only thing a search engine truly judges a website on is keywords and content. I'm not entirely sure I'm completely on board with that (I'm about 95%), but I do agree that content, especially after the recent SE updates, has shifted back into power.
My father owns a business that makes educational materials for people with mild to severe autism. He is very successful, but he personally doesn't have the time or energy to write a daily blog, and unfortunately doesn't trust anybody to ghostwrite for him.
So we came up with an alternative: a combination of original content mixed with educational reports, interesting studies, and every now and then a strange, funny story from The Onion. We would post at least one original piece a week (two if we could), and then fill in with everything else from there. I also made a few bullying infographics for his business to post and share on social media. It wasn't always keyword-heavy content, but as long as it was content worth sharing, it got us a lot of links.
At the end of the day, if I have to make a decision on how Google is doing something, I try to remind myself that Google is in the business of making money. They do that by providing the best, most accurate, human, natural, semantic, organic, perfect-because-I-am-a-snowflake result. Google, in my opinion, will take how current a website is into account.
Content is King.
-
This is our thought as well. A continuous feed of fresh content is a better approach than a one-off. This is how we've been doing it, but we're really interested in knowing whether others have tried the other approach with any lasting sustainability in traffic or rankings. (We kind of doubt it, but would love to see proof that it works.)
-
QDF is aimed at hot/current topics, right? So while it might be important for a news site or a celebrity gossip site, I don't think it will be relevant for every site.
You mentioned that the client proposed building the site with "all the pages you hope to rank for," which means the topic is restricted and there is a limit to what you can write about the subject. And to launch the site with this approach, you need to get all the content ready up front, which might take some time.
A much more sensible approach would be to launch the site with a reasonable amount of content and then add the rest when possible. This way you can start the link-building and social-sharing process early.
I don't think launching a site with lots of fresh content will, by itself, give you a jumpstart in traffic, but I'm interested to see if anyone has had success with this method.
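One practical way to support the "launch with core pages, add the rest over time" approach is to keep an XML sitemap whose `<lastmod>` dates reflect each newly published or updated page, so crawlers can discover the fresh content quickly. A minimal sketch of what that might look like (the domain, URLs, and dates here are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Core pages published at launch -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
  <!-- Page added a month later as part of the ongoing build-out -->
  <url>
    <loc>https://www.example.com/blog/fresh-content-study/</loc>
    <lastmod>2013-02-20</lastmod>
  </url>
</urlset>
```

Each time you add a page, append a new `<url>` entry with its publish date and resubmit the sitemap through Webmaster Tools, so the ongoing build-out is visible to the engines rather than just to your visitors.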
-
Fresh content is definitely important, and while you may get a boost at the start, you'll quickly lose it if you're not putting up new content.