Updating existing content - good or bad?
-
Hi All,
There are many situations where I encounter the need (or the wish) to update existing content.
Here are a few reasons:
- An update on the subject turned up that doesn't justify a new post/article, but rather just adding a couple of lines.
- The article was simply poorly written, yet the page has PageRank because it covers a good subject and has been online for quite some time (alternatively, I could create a new and improved article and 301 the old one to it).
- Improving the titles and subtitles of old existing articles.
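For the 301 option in the second point, a minimal sketch of what I have in mind (Apache .htaccess; the paths are hypothetical and it assumes mod_alias is enabled):

```apache
# Sketch: permanently redirect the old article to the new, improved one
# (hypothetical paths; assumes Apache with mod_alias enabled)
Redirect 301 /old-article.html /new-improved-article.html
```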
I would love to hear your thoughts on each of the reasons...
Thanks
-
Wikipedia updates content all the time and they seem to rank rather well.
From Google's perspective, they would rather rank up-to-date content, so yes, updating has got to be a good idea. An old page might have links pointing to it and a history with Google, so with up-to-date content it has got to be better than a brand-new page.
-
In all three cases mentioned in the post, it seems like a good idea to update the existing pages rather than create new posts/pages. Obviously, if the article is poorly written, you should fix its content and update the page instead of creating a new one; the same goes for the other two scenarios.
I think this video by SEOmoz contains your answer >> http://www.seomoz.org/blog/whiteboard-interview-googles-matt-cutts-on-redirects-trust-more
Hope this helps!
-
Hi Fernando,
Long time no see.
The site's tool is technically accurate; however, I just want to point out that if your server doesn't send the Last-Modified header, your URL obviously won't pass the check, but you don't need new hosting for that.
Here's an example of a URL that was set up appropriately:
http://www.feedthebot.com/tools/if-modified/
And here's what happens when I put in my homepage, which obviously doesn't send the header:
Does your webpage support the If-Modified-Since HTTP header?
Enter URL: www.feedthebot.com
No. This website does not support the If-Modified-Since HTTP header. Scroll down for details.
Technical stuff: this tool checked your HTTP headers and received this response...
Server response:
HTTP/1.1 200 OK
Server: WP Engine/1.2.0
Date: Thu, 02 May 2013 03:57:11 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
Keep-Alive: timeout=20
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Pragma: no-cache
X-Pingback: http://www.blueprintmarketing.com/xmlrpc.php
X-UA-Compatible: IE=Edge,chrome=1
X-Cacheable: SHORT
Vary: Accept-Encoding,Cookie
Cache-Control: max-age=600, must-revalidate
X-Cache: HIT: 13
X-Cache-Group: normal
X-Type: default
There does not appear to be a "last modified" header response. Therefore, this tool has determined that this URL does not support If-Modified-Since.
Here is some more information on If-Modified-Since:
http://www.seomoz.org/q/is-the-if-modified-since-http-header-still-relevant
It seems you want to pay close attention, when implementing it, to the clock on the server as well as on the client machine:
http://redmine.lighttpd.net/boards/2/topics/1999
http://trac.nginx.org/nginx/ticket/93
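The comparison those clock-skew tickets worry about can be sketched roughly like this (a simplified Python model; the function name and behavior are illustrative, not any server's actual code):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def is_not_modified(if_modified_since, last_modified):
    """Return True when answering 304 Not Modified is appropriate."""
    try:
        client_time = parsedate_to_datetime(if_modified_since)
    except (TypeError, ValueError):
        return False  # missing or unparsable header: send a full 200 response
    # HTTP dates carry whole seconds only, so drop sub-second precision.
    # A skewed server clock can make last_modified appear to be in the future,
    # which is exactly the failure mode the tickets above describe.
    return last_modified.replace(microsecond=0) <= client_time

last_mod = datetime(2013, 5, 2, 3, 57, 11, tzinfo=timezone.utc)
unchanged = is_not_modified("Thu, 02 May 2013 03:57:11 GMT", last_mod)  # same second
stale = is_not_modified("Wed, 01 May 2013 00:00:00 GMT", last_mod)      # older copy
```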
I hope this is of help,
Tom
-
If you are just updating the title, or rewriting the content, then I would go with the same page instead of creating a new one.
IF-MODIFIED-SINCE is an HTTP request header a spider can send to ask whether the content has changed since its last visit; if it hasn't, the server can answer 304 Not Modified instead of resending the page. You can read more here: http://www.feedthebot.com/ifmodified.html
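To make the exchange concrete, here is a minimal server-side sketch in Python (a toy handler, not what feedthebot or any real host runs): the first request gets a 200 with a Last-Modified header, and a repeat request that echoes that date back as If-Modified-Since gets a 304 with no body.

```python
import threading
import urllib.error
import urllib.request
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

# Pretend the article was last updated at this fixed moment
LAST_MODIFIED = datetime(2013, 5, 2, 3, 57, 11, tzinfo=timezone.utc)

class ArticleHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ims = self.headers.get("If-Modified-Since")
        if ims is not None and parsedate_to_datetime(ims) >= LAST_MODIFIED:
            self.send_response(304)  # unchanged since the spider's last visit
            self.end_headers()
            return
        body = b"the article content"
        self.send_response(200)
        self.send_header("Last-Modified", format_datetime(LAST_MODIFIED, usegmt=True))
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), ArticleHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{port}/"
first = urllib.request.urlopen(url)
first_status = first.status                  # full response with content
stamp = first.headers["Last-Modified"]

# Repeat visit: echo the Last-Modified date back as If-Modified-Since
repeat = urllib.request.Request(url, headers={"If-Modified-Since": stamp})
try:
    second_status = urllib.request.urlopen(repeat).status
except urllib.error.HTTPError as err:
    second_status = err.code                 # urllib surfaces 304 as an HTTPError

server.shutdown()
```

A crawler that keeps the `Last-Modified` value from its last fetch can use exactly this conditional request to skip re-downloading unchanged pages.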
-
That actually does sound familiar, even though I know most people create a new post describing the change and point to the old one (if there is enough to cover).
What about poorly written articles? Improving titles?
Could you please explain what you mean by "IF-MODIFIED-SINCE"?
Thanks
-
Matt Cutts from Google pointed out in a Whiteboard video that you should update existing pages instead of creating new pages containing only the updates.
You can indicate on the old page that the content was updated using "IF-MODIFIED-SINCE".
I can't find the video right now, but I am sure he said that.