Categories where "freshness" is important
-
I know that within the past couple of months, Google has made algorithm updates so that freshness of content is used as more of an indicator of relevancy, and hence of rankings.
see:
http://insidesearch.blogspot.com/2012/06/search-quality-highlights-39-changes.html
I understand that freshness is important across the board, but it is obviously more of a factor for certain search terms. My question is: how can you determine whether your product category (ecommerce) is one where freshness is becoming more of a factor? Is there any way to know which terms are considered to require fresher results?
Any input is appreciated.
-
Hello again,
I don't have much insight on this one, but I can share a personal experience that I think is relevant. I launched an Atlanta, GA-based printing website about three months ago, and thanks to some pre-launch SEO efforts, it ranked fairly well after the initial index.
Approximately six weeks later, after a "live beta test," my team decided to upgrade the CMS (Magento) and redesign the site to add some functionality that was missing or buggy. The site was "Under Construction" for about three days, and our rankings improved slightly after the new site was indexed, despite it having less content (fewer products) than the previous version of the site.
Recently (about three weeks ago), we added several more products, and our rankings increased dramatically (Google - 52 improved, 0 declined in SEOMoz rank tracking, 4x increase in queries, 2.5x increase in traffic).
These updates did, however, coincide with other SEO efforts, so it's hard to nail down what caused the improved metrics.
But... I definitely think that the addition of new content helped. In my market (Atlanta printing), many of my competitors' websites have been updated very little over the last several months or even years, so it doesn't take much to win that battle. In other markets this will of course be a different story. Like you said, I do think freshness of content can impact any search result, and it absolutely can't hurt to have the "freshest" site in any given market. Depending on the search term, fresh content could mean three days old or three months old, but I advise my clients to publish new or updated content at least every 30 days.
I think it all boils down to the competitiveness of the query and the rate at which other pages competing for that query are publishing fresh content.
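If you want to put a rough number on that, one quick-and-dirty check (just a sketch of the idea, not anything Google documents) is to grab the pages currently ranking for your query, pull publish dates out of common meta tags like `article:published_time`, and look at their median age. The tag names here are common conventions, not a guarantee, and you'd supply the top-ranking URLs' HTML yourself (e.g. fetched from the pages your rank tracker reports):

```python
# Sketch: estimate how "fresh" the pages ranking for a query are by
# extracting publish dates from their HTML and computing the median age.
import re
from datetime import datetime, timezone
from statistics import median

# Matches common publish-date meta tags (Open Graph / schema-style names).
DATE_META = re.compile(
    r'<meta[^>]+(?:property|name)=["\'](?:article:published_time|'
    r'datePublished|date)["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def extract_publish_date(html):
    """Return the first publish date found in common meta tags, else None."""
    m = DATE_META.search(html)
    if not m:
        return None
    raw = m.group(1)[:10]  # keep just the YYYY-MM-DD part
    try:
        return datetime.strptime(raw, "%Y-%m-%d").replace(tzinfo=timezone.utc)
    except ValueError:
        return None

def median_age_days(pages_html, now=None):
    """Median age, in days, of the pages whose dates could be parsed."""
    now = now or datetime.now(timezone.utc)
    ages = [
        (now - d).days
        for d in (extract_publish_date(html) for html in pages_html)
        if d is not None
    ]
    return median(ages) if ages else None
```

If the median age of the top results for a term is days or weeks, that's a hint Google is favoring fresh results for it; if it's years, freshness is probably a minor factor for that query. Many pages won't expose a date at all, so treat this as a rough signal.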
Thanks!
Anthony