Updating content on an existing URL or creating a new URL
-
Hi Mozzers,
We are an event organisation that produces around 350 events every year, all of which are on our website.
A lot of these events are held every year, so I have a URL like www.domainname.nl/eventname2012.
So what would you do? This URL has some inbound links, some social mentions, and so on. If the event is held again in 2013, would it be better to update the content on this URL or create a new one?
I would keep this URL and update it, because of the link value and because it is already indexed and ranking for the desired keyword for that event.
Cheers,
Ruud
-
No, we do use hyphens; the examples just left them out. And thanks for your answer. I think 3.1 would be a good idea.
I thought just replacing the content would be good because you refresh your content, you don't lose your link love, and the event content would be very similar anyway. We don't really want to rank for the old content; we want visitors to come to the event page and register for the event.
I'll have to think about it for a little while.
-
While I'm here, do you not have hyphens as word separators in your URLs or is it just for these examples that you're not putting them in?
i.e. Why have you gone for www.domainname.nl/eventname2013 vs www.domainname.nl/event-name-2013?
-
Tough one, these annual events. There are a few paths you may want to consider.
**1) Create a new URL - www.domainname.nl/event-name-2013**
A reasonable idea if the event is searched for by year, i.e. people will search "event name 2013". As you probably can't be sure what people are going to do, I'd suggest not relying on that and keeping the original URL as the main page. Make sure to link to all future years from there though (link to 2013, then 2014 when it comes, etc.; a quick sketch of such links follows the pros and cons).
PROS:
- You'll now have a naming convention and never have to worry about this problem again.
- You don't need to worry about what to do with last year's info.
- You build up your site's relevancy for the term with multiple pages on the same topic.
CONS:
- You lose any authority and link equity the main page has built up.
- If the pages are highly similar you may have trouble ranking the newer ones (or the older ones; I don't know how Google works that out).
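If you do go this route, the interlinking mentioned above could be as simple as a list of year links on the main event page. A minimal sketch, using the hypothetical domain and naming convention from this thread:

```html
<!-- Hypothetical year links on the main event page; the paths follow
     the example naming convention discussed in this thread. -->
<nav>
  <ul>
    <li><a href="http://www.domainname.nl/event-name-2013">Event name 2013</a></li>
    <li><a href="http://www.domainname.nl/event-name-2012">Event name 2012</a></li>
  </ul>
</nav>
```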
**2) Replace it** - Simply put up the new content for 2013 and overwrite the 2012 content.
Not great, for a couple of reasons: significantly changing the content may cost you some relevancy, and the archived content may still have value to users.
PROS:
- You get to keep the same URL, and it will always hold the most recent information (if you keep it updated).
- You get to keep your authority and link equity (caveat: if the content changes entirely, search engines may strongly devalue previous links to that page).
CONS:
- You lose content.
- You may lose relevancy.
**3) Update the content with 2013's schedule and place the older content on a new page - http://www.domainname.nl/event-name-2012**
This way you can keep working on the existing URL but don't lose the old content.
PROS:
- You build up your site's relevancy for the term with multiple pages on the same topic.
CONS:
- You may confuse search engines by moving the content they expected to another page.
**3.1) Canonicalise the 2012 content**
As above, but you add a canonical tag to the 'archived' 2012 page telling search engines that the main page is the one they're looking for (see the example tag after the pros and cons).
PROS:
- Users still have access to the older content.
CONS:
- The old content no longer counts for much.
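For reference, the canonical tag on the archived page would look something like this. The `href` below is an assumed example; it should point at whatever your real, existing event URL is:

```html
<!-- Placed in the <head> of the archived page
     (http://www.domainname.nl/event-name-2012). The href here is an
     assumed example; point it at your actual main event page. -->
<link rel="canonical" href="http://www.domainname.nl/event-name" />
```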
**4) Add the new content to the main page and keep 2012's underneath**
You could simply update the page with an HTML5 `<header>` combo, or demote the previous year's headings to `<h2>`s and use an `<h1>` for this year. You can even somewhat hide the 2012 stuff using CSS, jQuery or plain JS (maybe AJAX, I'm not sure), which means the page can still pretty much look the way you want.
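A minimal sketch of that idea; the element choices, IDs and the little toggle are illustrative assumptions, not something prescribed in this thread:

```html
<article>
  <header>
    <h1>Event name 2013</h1>
  </header>
  <p>This year's schedule, speakers and registration info...</p>

  <!-- Previous year demoted to an <h2> and collapsed by default -->
  <section id="event-2012" hidden>
    <h2>Event name 2012</h2>
    <p>Last year's schedule and write-up...</p>
  </section>

  <!-- Simple toggle so visitors can still reach the old content;
       keyword-heavy hidden text can look spammy, so keep it modest -->
  <a href="#event-2012"
     onclick="document.getElementById('event-2012').hidden = false;">
    Show the 2012 edition
  </a>
</article>
```

Whether you collapse the old content with the `hidden` attribute, CSS or a jQuery toggle is mostly a design choice; the caveat about hidden text in the cons below applies either way.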
PROS:
- Adding more relevant content to a page can improve the page's quality.
- All content is accessible from one location for the user.
CONS:
- If it is year-specific, you may dilute the relevancy.
- It shouldn't be seen as hiding content, but if there's a lot of keyword-heavy text in the hidden divs it may trigger some sort of alert.
What would I do? It depends on the event and the type of site, I guess. Most likely 3.1 or 4, but as I'm not 100% happy with what canonicalisation does, probably 4.
If anybody wants to jump in with other ideas or other pros and cons, there's probably a lot I've not thought about.
-
No problem, my friend, and thanks for the explanation. If you are going to repeat the event, then there is no point in creating a new page for it. You can just add the new event to the same page, listed under a different year. So the point is: the URL should not change, but the page gets updated with the new event's info. This is also very good from an SEO standpoint, as the page will be constantly updated with new content and you will still enjoy all the link love it has accumulated over time.
Hope this helps.
Best regards,
Devanur Rafi.
-
Hi Rafi, what you are saying is correct, but every event has its own page. The question is: if we repeat an event, what would you do - create a new event page or update that event's old page?
Say we have www.domainname.nl/searchlove (wish we had that event).
If we repeat SearchLove in 2013, would you put all the new SearchLove 2013 data on www.domainname.nl/searchlove, or would you create a new URL, www.domainname.nl/searchlove2013?
Sorry if the question was, or is, a bit difficult to understand (it's mainly because of my English).
-
Hi Ruud,
Straight into the meat: if you start adding all the events to the same page, it will dilute the page's ability to rank, as it would have to rank for multiple events (event names), and this is not recommended. So the best thing for you to do would be to create individual event pages and let each one rank for its specific event name. Doing this, you will not only rank well with event-specific pages, but the size of your website will also grow, which is very good for your site going forward: search engines like big websites with lots of unique content, and big sites have a better chance of becoming authoritative in their niche than their smaller peers. Hope you got the point.
Good luck.
Regards,
Devanur Rafi.