How does Google determine freshness of content?
-
With the changes in the Google algorithm emphasizing freshness of content, I was wondering how they determine freshness and what constitutes new content. For instance, if I write a major update to a story I published last July, is the amended story fresh? Is there anything I can do, in addition to publishing brand new content, to make sure Google sees all my new content?
-
Rand and iPullRank just did a great Whiteboard Friday on the topic:
http://www.seomoz.org/blog/googles-freshness-update-whiteboard-friday
-
Google will find new content on your site, whether it is new pages or revised pages.
So, do what works best for your visitors; that usually works best for Google too.
Your visitors are the ones who send signals to Google through their actions on your site: likes, tweets, links, bookmarks and more.
Build a great site and Google will usually like it.
-
I think there is one thing that matters even more than freshness of content: regularly updated content. It is not enough just to write a single page about a topic, publish it, and wait for Google to index it. That page is considered fresh, but it is not updated and probably will not make it to the top. Look at news portals, on the other hand: they are constantly publishing and getting indexed several times a day.
So the consequence is that it is not enough to have fresh content once a year; you need regular fresh content, and the more the better. This way you are getting indexed frequently, so you don't need to worry about Google finding fresh material, and your rankings will improve as Google sees your site is up to date and always contains the latest news (you would not want to learn from a ten-year-old SEO book either).
Two things you can do besides regular writing to get indexed more often are to implement Google site search as your internal search engine and to keep your pages as light as possible. Google spends a planned amount of time on your site, and it can make a big difference whether Google can index 5 or 10 pages in that given amount of time.
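One more concrete thing you can do to help Google see updated pages quickly is to keep an XML sitemap with accurate `<lastmod>` dates. As a minimal sketch (the URLs, dates, and `build_sitemap` helper below are hypothetical, not from the answers above), here is one way to generate such a sitemap in Python:

```python
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(pages):
    """Build a sitemap XML string from (url, last_modified_date) pairs.

    A <lastmod> value tells crawlers which pages changed recently,
    so a revised article can be picked up as updated content.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url, last_modified in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = last_modified.isoformat()
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example: a revised old story plus a brand new article.
pages = [
    ("http://example.com/story-from-last-july", date(2011, 11, 10)),
    ("http://example.com/brand-new-article", date(2011, 11, 12)),
]
print(build_sitemap(pages))
```

Resubmitting the sitemap (or letting Google recrawl it) is a supplement to, not a replacement for, the regular publishing discussed above.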