Better to publish new pricelist articles regularly or update the existing ones?
-
Hello Moooooooooooooz!
I could not sleep yesterday because of an SEO nightmare! So I came up with the following question:
"Is it better to release new articles regularly or to update the existing ones?" Let me explain.
Our company releases pricelists regularly (every month, new pricelists become available for one month, for the same brands, e.g. a January pricelist for brand A, etc.).
Right now those pricelists are ranking well on Google.
So I wondered: would it be better to:
- **Make the pricelist articles stronger:** Our company - Brand A pricelist (title), blog/offer/brand-A-pricelist.html (URL) -> every month I update the text, so I have just one article/link to work on
- **Make more content on the pricelists:** Our company - Brand A pricelist - January 2014 (title), blog/offer/brand-A-pricelist-january.html (URL) -> so Google keeps indexing fresh new content
- **Work on an extra category:** Our company - Brand A pricelist - January 2014 (title), blog/offer/brand-A/pricelist-january.html (URL) -> so I work on one link, blog/offer/brand-A, where Google finds lots of fresh, relevant content
I know that Matt Cutts said it's good to update an old article, but this case is a bit different. Has anyone experimented with the same situation?
Thanks a lot!
-
Thanks for your help! I'll try different options to see which works best!
-
More is not always better. If the "more" information Google is able to index consists of outdated price lists from months or years ago, it would likely do more harm than good.
-
But in this case we'll get less information indexed in Google.
In our case we release pricelists with part number + description + price.
This is why I'm a bit lost.
-
Hello fupfac,
I would go with one evergreen piece of content in this case, in order to consolidate ranking signals, develop more trust over time, let visitors keep using their bookmark of the page for more than a month, avoid having to update existing links, etc.
I would also 301 redirect the old price lists to the new, evergreen one.
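For example, on an Apache server (where a Joomla site typically runs), one rule in .htaccess can catch all of the dated pricelist URLs. This is only a minimal sketch assuming the hypothetical URL pattern from the question; adjust the regex to your actual paths:

```apache
# Minimal sketch for .htaccess (Apache mod_alias), assuming dated URLs
# shaped like blog/offer/brand-A-pricelist-january.html (hypothetical).
# Permanently (301) redirect every dated pricelist to the evergreen page.
RedirectMatch 301 ^/blog/offer/brand-A-pricelist-[a-z]+\.html$ /blog/offer/brand-A-pricelist.html
```

The anchored pattern requires a month suffix, so the evergreen URL itself is not matched and the rule cannot redirect in a loop.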
-
Hello !
Thanks! I'm actually more or less using option 1, but I think option 2 would be smarter. Not so easy to decide... I do think option 2 would be best, but in that case I'll always have to update the creation date.
I'm currently displaying the pricelists on a Joomla blog.
-
Customer experience is a big factor here. Will it create customer service problems if buyers end up working from outdated price lists? A number of outdated price lists in the index could well cause confusion.
Options:
- Clearly indicate the validity period of each price list and give users a link from every old price list to a single location for the newest one. This should aggregate link juice to your latest price list (see the sketch below).
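A hypothetical sketch of such a notice on an archived pricelist page (the URL and dates are placeholders based on the question's examples, not confirmed paths):

```html
<!-- Hypothetical validity banner for an archived pricelist page -->
<p>
  This pricelist was valid for January 2014 only.
  <a href="/blog/offer/brand-A-pricelist.html">See the current Brand A pricelist</a>.
</p>
```

Putting the same link in the same spot on every old page gives both users and crawlers a consistent path to the latest version.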
-
Going forwards, I would just publish it in one place. That means one single page to optimise and update, and a single place to aggregate link juice. People won't link to your price lists if they're always going out of date; if everything lives in one place, more people are likely to link to it.
Just my approach. Without knowing the actual brand, rankings, keywords etc. it's difficult to be more precise at this stage, and I'm sure other SEOs may take a different point of view.