Is Fresh Content Still Important?
-
We have an internal debate that perhaps y'all can help us resolve.
In the past, "freshness" of content has been important, correct? (Google's QDF, for example.) Up to now, when we build a site with the intent to optimize it for search, we build the core pages with the expectation that we will be adding more pages as the project progresses, thus satisfying the "fresh content" factor.
But a client has proposed that we build the site out completely up front, with all the pages we hope to rank, getting the full bang for our buck at launch. The expectation is that traffic soars right off.
Now, the client says that he has been doing this for years and has not been affected by any algorithm changes (although we have not seen proof of this from him).
So our question is this: Is it better to launch a website full of fresh content at the beginning of the project, for a jumpstart on traffic, and then leave the site alone (for the most part)?
or
Is it better to have core pages of fresh content at the start and build out new pages from there, so the website remains fresh every month?
And can you prove your argument? (We need cold, hard facts to be convinced.)
-
EGOL, a big-time member on these forums, posted years ago that there will come a day when the only things a search engine truly judges a website on are keywords and content. Now, I'm not entirely sure I'm completely on board with that (I'm about 95% there), but I do agree that content, especially after the recent search engine updates, has shifted back into power.
My father owns a business; we make educational materials for people with mild to severe autism. He is very successful, but he personally doesn't have the time or energy to write a daily blog, and unfortunately doesn't trust anybody to ghostwrite for him.
So we came up with an alternative: a combo of original content mixed with educational reports, interesting studies, and, every now and then, some strange funny story from The Onion. We would post at least one original piece a week (two if we could), and then everything else from there. I also made a few bullying infographics for his business to post and share on social media. Now, it wasn't always keyword-heavy content, but as long as it was content worth sharing, it got us a lot of links.
At the end of the day, if I have to make a decision on how Google is doing something, I try to remind myself that Google is in the business of making money. They do that by providing the best, most accurate, human, natural, semantic, organic, perfect-because-I-am-a-snowflake result. Google, in my opinion, will take how current a website is into account.
Content is King.
-
This is our thought as well: a continuous feed of fresh content is a better approach than a one-off. This is how we've been doing it, but we're really interested in knowing if others have tried the other approach with any lasting sustainability in traffic or ranking. (We kind of doubt it, but would love to see proof that it works.)
-
QDF (Query Deserves Freshness) is aimed at hot or current topics, right? So while it might be important for a news site or a celebrity gossip site, I don't think it will be relevant for every site.
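Purely as a toy illustration (this is not Google's actual formula, just a sketch of the principle): you can think of QDF as a time-decay multiplier that only kicks in when a query is flagged as time-sensitive, which is why freshness matters for news and gossip but not for evergreen pages. All the numbers and function names below are made up.

```python
import math
import time

def freshness_multiplier(published_ts, half_life_days=30.0):
    """Exponential decay: 1.0 for a brand-new page, halving every half-life."""
    age_days = (time.time() - published_ts) / 86400
    return math.exp(-math.log(2) * age_days / half_life_days)

def score(base_relevance, published_ts, query_is_time_sensitive):
    """Only time-sensitive ('QDF') queries get the freshness boost;
    evergreen queries rank on relevance alone."""
    if not query_is_time_sensitive:
        return base_relevance
    return base_relevance * (0.5 + 0.5 * freshness_multiplier(published_ts))

week_old = time.time() - 7 * 86400
year_old = time.time() - 365 * 86400
print(score(0.8, week_old, True) > score(0.8, year_old, True))    # True: fresher page wins on a hot topic
print(score(0.8, week_old, False) == score(0.8, year_old, False)) # True: they tie on an evergreen query
```

Under a model like this, leaving an evergreen site untouched costs you nothing, while a news site bleeds ranking the moment it stops publishing.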
You mentioned that the client has proposed to build the site with "all the pages you hope to rank for," which means the topic is restricted and there is a limit to what you can write about the subject. But to launch the site with this approach, you need to get all the content ready, and that might take some time.
A much more sensible approach would be to launch the site with a reasonable amount of content and then add the rest when possible. This way you can start the link-building and social-sharing process early.
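One practical way to support that gradual rollout is to keep your XML sitemap's lastmod dates current as each new page goes live, so crawlers can see the site growing. Here's a minimal Python sketch using only the standard library — the URLs, dates, and file path are made up for illustration:

```python
import xml.etree.ElementTree as ET
from datetime import date

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages, path="sitemap.xml"):
    """Write a sitemap where each entry carries the date the page was
    published or last updated, so crawlers can spot new content."""
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
        ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod.isoformat()
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Launch with the core pages, then re-run this as each new page goes live.
pages = [
    ("https://example.com/", date(2013, 1, 7)),
    ("https://example.com/services", date(2013, 1, 7)),
    ("https://example.com/blog/new-article", date(2013, 2, 4)),  # added later
]
build_sitemap(pages)
```

Regenerating the sitemap on every publish is cheap, and it gives the engines an explicit signal that the site is actively growing rather than a one-off build.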
I don't think that just launching a site with lots of fresh content will give you a jumpstart in traffic, but I'm interested to see if anyone has had success with this method.
-
Fresh content is definitely important, and while you may get a boost at the start, you'll quickly lose it if you're not putting up new content.