Looking to remove dates from URL permalink structure. What do you think of this idea?
-
I know most people who remove dates from their URL structure usually do so and then set up 301 redirects, and I believe that's typically the right way to go about it. My biggest fear with doing a global 301 implementation like that across an entire site is that I've seen cases where it shocked Google and the site took a serious hit in organic traffic.
Here's what I'm thinking a safer approach would be, and I'd like to hear others' thoughts. What if we...
- Changed the permalink structure moving forward so that future posts don't include the date.
- Left all current URLs as they are, with their dates.
- Went back and optimized past posts in waves (including proper 301 redirects and better URL structure). This way we avoid potentially shocking Google with a global change across all URLs.
Do you know of a way this is possible with a large WordPress website? Do you see any complications that could come about in this process? I'd like to hear any other thoughts about this, please.
Thanks!
-
Hey Jeff,
Thank you for your input. So you just globally changed the permalink structure, put global redirects in place, and you didn't see a permanent loss in traffic? And you did that on multiple sites?
If so, I'll most probably follow your path.
Thanks again,
Julien
-
Hey Julien -
I wouldn't go this route. Since asking this question I have had dates removed from 30+ domains, many with 5-10 million+ pageviews per month. We haven't found it to be a risk, and we're now strongly in favor of removing dates from URLs on most sites we work with. We work with sites that have very evergreen content, and republishing is a very strong SEO strategy.
The process is very similar to moving your site from HTTP to HTTPS, and since Google started recommending HTTPS, we haven't seen any issues with removing dates either.
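In both cases you're making one global, pattern-based change. For comparison, here's a minimal sketch of the HTTPS half of that analogy, assuming Apache mod_rewrite in an .htaccess file (the date-removal rule appears further down the thread):

# Force HTTPS site-wide with a single 301 rule
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]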
Hope that helps
-
Hey Thomas,
Interesting thought! Could you go into a little more detail as to how that regex would work? Would that limit the redirects to only a portion of the posts?
Thanks!
Julien
-
I would only do 10% of pages first and watch them; if you like what you see, do the next 20%.
RedirectMatch 301 ^/([0-9]{4})/([0-9]{2})/(.*)$ http://yourwebsite.com/$3
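That rule redirects every dated URL at once, so to stage it in waves, one option (a sketch, assuming Apache mod_alias and the standard /year/month/postname permalink structure, with yourwebsite.com as a placeholder) is to hardcode one publication year at a time and add rules as each wave checks out:

# Hypothetical first wave: redirect only posts published in 2012
RedirectMatch 301 ^/2012/([0-9]{2})/(.*)$ http://yourwebsite.com/$2
# Later waves add further years once traffic holds steady, e.g.:
# RedirectMatch 301 ^/2013/([0-9]{2})/(.*)$ http://yourwebsite.com/$2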
-
Garrett -
I never got a clear answer, but I have since gone forward making changes on 20+ WordPress blogs without any ill effects. The changes we made were only to sites that had dates in the permalink structure, and 301 redirects were put in place (on the server, not through a plugin). Trying to change the permalink structure going forward but not back was too much of a hassle. It appears Google sees this as a positive change for users because it cleans up the permalink structure and allows site owners to keep their content updated and continue sharing.
Not sure how this will apply in other scenarios such as removing folder structure (categories and tags) from the permalink, but I've had only positive results removing the dates. I work with some very high profile mom and food blogs so I have some pretty solid evidence and data supporting my decisions now.
I hope that helps. Cheers!
-
Hi Jeff,
Did you end up making these changes? How is it going? I found your post as I was researching and rethinking how to structure WordPress blog permalinks.
I have a few e-commerce clients with blog posts that are several years old and still popular in organic search. I'd like to turn some of them into evergreen content that is regularly updated, but I feel like we should do something about the permalinks first.
There are some great insights here. Thank you to all who contributed.
Garrett
-
No problem, glad to help! Best of luck with whichever route you go with!
-
It was worth a shot. Thanks for sharing your thoughts. Cheers!
-
Unfortunately, I don't have any examples for ya. I've never come across this particular topic for a client.
-
Do you know of any site that has used the canonical to do anything like this? It seems like the safest option; I just haven't seen it done at this scale, is all.
-
Yes, I'm saying you should keep URLs as they are. I'm always an advocate for not changing URL structure unless there's a really good, highly beneficial reason for doing so. I don't know of a way to change only new URL structures while keeping old ones the same, but I'm no WP expert.
-
Although I haven't strongly considered that approach, it did cross my mind to utilize the canonical. Do you know of any way to change the WordPress permalink structure going forward but not backward? Or are you suggesting we keep the dates in the URL going forward? I just think that eventually we'll have to think about updating that URL structure.
-
OK, now that I understand the reasoning...
I believe there's a better, less risky approach. What I would do is write a completely new post based on information from the old post. At the same time you publish the new post, go back to the old version and add two things: a canonical tag pointing to the new version, and a bit of very readable text at the top linking to the new post. Something like: "Hey, thanks for your interest in our content. Feel free to read on, but we thought you should know we've updated this post, which can be found here: link."
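Concretely, the old post would end up carrying something like this (a sketch; the URL is hypothetical):

<!-- In the <head> of the old, outdated post -->
<link rel="canonical" href="https://www.example.com/updated-version-of-post/" />

with the visible note at the top of the body linking to that same updated URL.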
This accomplishes a few important things. It eliminates the need for a risky project that could affect your entire site just for the ability to update posts (which I'm guessing doesn't happen too often; what percent of posts get updated?). The canonical tag removes the duplicate content risk so you're not cannibalizing your own content. And leaving the old post there gives people the opportunity to discover old content that, while possibly not relevant anymore, still demonstrates you've been a trustworthy source of information for a long time.
-
Logan,
By not being able to remove the dates, we're not able to go back to a 5-year-old post, make updates, and then republish the content. This is a "mom blog" and the topics can be recycled, but if we create a new post covering something we also covered 5 years ago, we would be competing with ourselves instead of using something that already has some authority and rank to it.
That's why we were thinking of somehow making it possible (in WordPress) to keep all current URLs as they are, change the permalink structure moving forward so that future posts don't have dates, and then update posts as we go and 301 them manually over time. Does that make sense?
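To make the idea concrete, here's a rough, untested sketch of how that might be wired up in WordPress, with the site-wide permalink setting changed to /%postname%/ (the cutoff date is hypothetical, and this is an assumption on my part rather than a tested recipe):

// Keep the old dated structure for posts published before a cutoff,
// so existing URLs don't change; new posts use the site-wide setting.
add_filter( 'pre_post_link', function ( $permalink, $post ) {
    if ( $post->post_date < '2017-01-01' ) {      // hypothetical cutoff
        return '/%year%/%monthnum%/%postname%/';  // old posts keep dates
    }
    return $permalink;                            // new posts: /%postname%/
}, 10, 2 );

// Let requests for the dated URLs still resolve by post name
// (flush rewrite rules once after adding this).
add_action( 'init', function () {
    add_rewrite_rule(
        '^[0-9]{4}/[0-9]{2}/([^/]+)/?$',
        'index.php?name=$matches[1]',
        'top'
    );
} );

Then, as each old post gets updated and republished, the manual 301 would just be a one-line server rule (hypothetical paths):

Redirect 301 /2013/06/some-old-post/ /some-old-post/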
I agree with your last two statements; it is a HUGE risk to 301 this entire site just to do away with those dates. Even though redirects supposedly pass all link juice, we all know that a big change like that across an entire site could have ill effects with search engines.
I'd like to know if anyone has gone about the URL structure change the way I'm outlining here. Am I crazy to think that's a logical way to go about it? I haven't been able to find anywhere that someone has done this, though.
-
Jeff,
Based on the traffic you say this blog gets, I'm assuming it's rather large and has hundreds, if not thousands, of posts. Which leads me to one simple question:
Why? This seems like a HUGE amount of risk and a pretty decent amount of work to go into something that's really not going to provide any benefit.
*edit: It should also be noted that just because Google has recently stated that redirects now pass all link juice doesn't mean you should needlessly add a massive number of redirects. Redirects have other implications, like load time, for example. If you have 1,000 redirect rules, every single one of them is going to be checked before any page on your site loads, which takes a lot of time.
-
Thanks for your response. I actually agree with most, if not all, of what you're saying.
The problem is that this is a larger blog with 5-7 million page views per month on average, 1 million+ of which come just from organic. I agree with your point about postponing and never getting it done. With a large blog I still think it would be easier (less stressful, if not necessarily easier) to manage it in waves, so we can pause or course-correct if there's a larger-than-normal dip that doesn't come back up. With a business it makes sense, but with these bloggers' sites it seems like too big a risk when it's what brings in almost all the income. Does that make sense?
I thought the tweet you're referring to was mainly in regard to HTTP-to-HTTPS migrations. I need to look more into that, I guess.
Thanks!
-
I'm not a fan of your plan.
There can be many reasons why a site might "take a hit". For example, page-to-page redirects may not have been implemented, or the sitemap may not have been updated, updated correctly, or resubmitted to search engines. I wouldn't assume that will happen in your case. In my experience, if the transition is done correctly and there is a hit, it's short-lived.
If you're thinking the redirects will cause you to lose SEO equity, that is no longer the case. Gary Illyes, a Google webmaster trends analyst, tweeted on July 26, 2016 "30x redirects don’t lose PageRank anymore."
One of the biggest risks (in my mind) of staging the migration the way you suggest is that the "waves" never happen. I see that a lot - a situation where an organization agrees to postpone work to a future date that never arrives. New and competing priorities take precedence resulting in an endless postponement. If you have the management commitment, funding and resources to do the work now, I say bite the bullet and go for it. Make a plan. Stick to it. Check and double check your work.
Related Questions
-
Mass URL changes and redirecting those old URLS to the new. What is SEO Risk and best practices?
Hello good people of the Moz community, I am looking to do a mass edit of URLs on content pages within our sites. The way these were initially set up, a few years ago, was to be unique by having the date in the URL, which can now make evergreen content seem dated. The new URLs would follow a better folder-path naming convention and would be far better URLs overall. Some examples of the old URLs:
https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Inline-Hockey-Skates/buying-guide-9-3-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Aggressive-Skates/buying-guide-7-19-2012,default,pg.html
The new URLs would look like this, which would be a great improvement:
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Hockey-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Aggressive-Skates,default,pg.html
My worry is that we rank fairly well organically for some of the content and I don't want to anger the Google machine. The process would be to edit the URLs to the new layout, then set up the redirects for them and push live. Is there a great SEO risk to doing this? Is there a way to do a mass "Fetch as Googlebot" to reindex these if I do, say, 50 a day? I only see the ability to do one URL at a time in the Webmaster Tools backend. Is there anything else I am missing? I believe this change would be good in the long run, but I do not want to take a huge hit initially by doing something incorrectly. This would be done on anywhere from 5 to a couple hundred links across the various sites I manage. Thanks in advance,
Chris Gorski
-
Best-practice URL structures with multiple filter combinations
Hello, we're putting together a large piece of content that will have some interactive filtering elements. There are two types of filters: topics and object types. The architecture under the hood constrains us so that everything needs to be in URL parameters. If someone selects a single filter, this can look pretty clean:
www.domain.com/project?topic=firstTopic
or
www.domain.com/project?object=typeOne
The problems arise when people select multiple topics, potentially across the two different filter types:
www.domain.com/project?topic=firstTopic-secondTopic-thirdTopic&object=typeOne-typeTwo
I've raised concerns about the structure in general, but it seems to be too late at this point, so now I'm scratching my head thinking of how best to get these indexed. I have two main concerns: a ton of near-duplicate content, with hundreds of URLs being created and indexed for the various filter combinations; and over-reacting to that first point by over-canonicalizing/no-indexing combination pages to the detriment of the content as a whole. Would the best approach be to index each single topic filter individually and canonicalize any combinations to the 'view all' page? I don't have much experience with e-commerce SEO (which this problem seems to have the most in common with), so any advice is greatly appreciated. Thanks!
-
Should I include URLs that are 301'd or only include 200 status URLs in my sitemap.xml?
I'm not sure if I should be including old URLs (content) that are being redirected (301) to new URLs (content) in my sitemap.xml. Does anyone know if it is best to include or leave out 301'd URLs in an XML sitemap?
-
What's the best possible URL structure for a local search engine?
Hi Mozzers, I'm working at AskMe.com, which is a local search engine in India, i.e. if you're standing somewhere looking for the pizza joints nearby, we pick up your current location and share the list of pizza outlets nearby, along with ratings, reviews, etc. for these outlets. Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for the city-specific category pages (here, "Delhi" is the city name and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area (here "Saket") in a city. The URL looks a little different if you're searching for something which is not a category (or not mapped to a category, in which case we 301 redirect you to the category page); it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for pizza huts in Saket, Delhi, as "pizza huts" is neither a category nor mapped to any category. We're also dealing in ads and deals, along with our very own e-commerce brand AskMeBazaar.com, to create a better user experience and a one-stop shop for our customers. Now we're working on a URL restructuring project, and my question to all you SEO rockstars is: what is the best possible URL structure we can have? Assume we have kick-ass developers who can manage any given URL structure on the backend.
-
URL Injection Hack - What to do with spammy URLs that keep appearing in Google's index?
A website was hacked (URL injection) but the malicious code has been cleaned up and removed from all pages. However, whenever we run a site:domain.com search in Google, we keep finding more spammy URLs from the hack. They all lead to a 404 error page since the hack was cleaned up in the code. We have been using the Google WMT Remove URLs tool to have these spammy URLs removed from Google's index, but new URLs keep appearing every day. We looked at the cache dates on these URLs; they vary, but none are recent and most are from a month ago when the initial hack occurred. My question is: should we continue to check the index every day and keep submitting these URLs to be removed manually? Or, since they all lead to a 404 page, will Google eventually remove these spammy URLs from the index automatically? Thanks in advance, Moz community, for your feedback.
-
Attack of the dummy URLs -- what to do?
It occurs to me that a malicious program could set up thousands of links to dummy pages on a website:
www.mysite.com/dynamicpage/dummy123
www.mysite.com/dynamicpage/dummy456
etc. How is this normally handled? Does a developer have to look at all the parameters to see if they are valid and, if not, automatically create a 301 redirect or a 404 Not Found? That requires a table lookup of acceptable URL parameters for all new visitors. I was thinking that bad URL names would be rare, so it would be OK to just stop the program with a message, until I realized someone could intentionally set up links to nonexistent pages on a site.
-
Where to put a page ID in a URL?
Hello, my company is going to change URLs to example.com/category or example.com/product. When we change the URLs for product and category pages, we somehow have to check whether the requested page comes from the category table in the DB or from the products table (doing this lookup efficiently matters a lot for page load time). So we have to choose how to build the different product and category pages. The programmers said that we need to insert an ID into the URL. So the question is: which is the better way to place an ID in a URL?
example.com/product-name?id=111
example.com/product-name/111
example.com/product_name-111
Or maybe we should use some other punctuation mark to separate the ID from the product name? P.S. I have read "Dynamic URLs vs. static URLs" by Google and it still didn't answer which is best for all of the pages. Somehow others solve this problem by putting only the names in the URL, but could anyone tell me what that technology would be?
-
Removing Dynamic "noindex" URLs from Index
Six months ago my client's site was overhauled, and the user-generated searches had an index tag on them. I switched that to noindex, but didn't make the change fast enough to avoid hundreds of pages being indexed in Google. It's been months since switching to the noindex tag and the pages are still indexed. What would you recommend? Google crawls my site daily, but never the pages that I want removed from the index. I am trying to avoid submitting hundreds of these dynamic URLs to the removal tool in Webmaster Tools. Suggestions?