Looking to remove dates from URL permalink structure. What do you think of this idea?
-
I know most people who remove dates from their URL structure usually do so and then set up 301 redirects, and I believe that's typically the right way to go about it. My biggest fear with a global 301 implementation like that across an entire site is that I've seen cases where it seemed to shock Google and the site took a significant hit in organic traffic.
Here's what I'm thinking would be a safer approach, and I'd like to hear others' thoughts. What if we...
- Changed the permalink structure moving forward so future posts don't include the date.
- Left all current URLs as they are, dates included.
- Went back and optimized past posts in waves over time (including proper 301 redirects and better URL structure). This way we avoid potentially shocking Google with a global change across all URLs.
Do you know of a way to do this with a large WordPress website? Do you see any complications that could come up in this process? I'd like to hear any other thoughts on this, please.
Thanks!
-
Hey Jeff,
Thank you for your input. So you globally changed the permalink structure, put global redirects in place, and didn't see any permanent loss in traffic? And you did that on multiple sites?
If so, I'll most probably follow your path.
Thanks again,
Julien
-
Hey Julien -
I wouldn't go this route. Since asking this question, I've had dates removed from 30+ domains, many with 5-10 million+ pageviews per month. We haven't found it to be a risk and are now strongly in favor of removing dates from URLs on most sites we work with. We work with sites that have very evergreen content, and republishing is a very strong SEO strategy.
The process is very similar to moving a site from HTTP to HTTPS. Just as we haven't seen issues with HTTPS migrations since Google started recommending them, we haven't seen any issues with removing dates either.
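One related detail worth flagging: if a site is moving to HTTPS around the same time, it's best to send the old dated URLs straight to the final HTTPS, date-free URL in a single hop rather than chaining two redirects. A rough sketch of that combined rule, assuming Apache and with yourwebsite.com as a placeholder:

RedirectMatch 301 ^/([0-9]{4})/([0-9]{2})/(.*)$ https://yourwebsite.com/$3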
Hope that helps
-
Hey Thomas,
Interesting thought! Could you go into a little more detail on how that regex would work? Would that limit the redirects to only a portion of the posts?
Thanks!
Julien
-
I think you should only do 10% of pages first and watch them; if you like what you see, do the next 20%.
RedirectMatch 301 ^/([0-9]{4})/([0-9]{2})/(.*)$ http://yourwebsite.com/$3
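A rough sketch of how the staging could work (yourwebsite.com is a placeholder, and this assumes Apache mod_alias): start with a pattern that only matches a slice of the archive, such as a single year, and widen it wave by wave:

# Wave 1: only redirect posts published in 2011
RedirectMatch 301 ^/2011/([0-9]{2})/(.*)$ http://yourwebsite.com/$2
# Later waves: widen the alternation, e.g. ^/(2011|2012)/([0-9]{2})/(.*)$
# (with $3 as the slug), until you reach the full rule above.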
-
Garrett -
I never got a clear answer, but I have since gone ahead and made the change on 20+ WordPress blogs without any ill effects. The changes were made only on sites that had dates in the permalink structure, and 301 redirects were put in place (on the server, not through a plugin). Trying to change the permalink structure going forward but not retroactively was too much of a hassle. It appears Google sees this as a positive change for users because it cleans up the permalink structure and lets site owners keep their content updated and continue sharing it.
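For anyone following along, the change itself has two parts (a sketch, not a drop-in config; yourwebsite.com is a placeholder and this assumes Apache): in WordPress, under Settings → Permalinks, switch the custom structure from /%year%/%monthnum%/%postname%/ to /%postname%/, then add the server-side redirects. If the old permalinks also included the day, that more specific rule needs to come first:

# Old structure included the day (/%year%/%monthnum%/%day%/%postname%/):
RedirectMatch 301 ^/([0-9]{4})/([0-9]{2})/([0-9]{2})/(.*)$ http://yourwebsite.com/$4
# Old structure was year/month only:
RedirectMatch 301 ^/([0-9]{4})/([0-9]{2})/(.*)$ http://yourwebsite.com/$3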
I'm not sure how this applies to other scenarios, such as removing folder structure (categories and tags) from the permalink, but I've had only positive results removing dates. I work with some very high-profile mom and food blogs, so I now have some pretty solid evidence and data supporting these decisions.
I hope that helps. Cheers!
-
Hi Jeff,
Did you end up making these changes? How is it going? I found your post as I was researching and rethinking how to structure WordPress blog permalinks.
I have a few e-commerce clients with blog posts that are several years old and still popular in organic search. I'd like to turn some of them into evergreen content that is regularly updated, but I feel like we should do something about the permalinks first.
There are some great insights here. Thank you to all who contributed.
Garrett
-
No problem, glad to help! Best of luck with whichever route you go with!
-
It was worth a shot. Thanks for sharing your thoughts. Cheers!
-
Unfortunately, I don't have any examples for ya. I've never come across this particular topic with a client.
-
Do you know of any site that has used the canonical to do anything like this? It seems like the safest option; I just haven't seen it done at this scale is all.
-
Yes, I'm saying you should keep URLs as they are. I'm always an advocate for not changing URL structure unless there's a really good, highly beneficial reason for doing so. I don't know of a way to change only new URL structures while keeping old ones the same, but I'm no WP expert.
-
Although I haven't strongly considered that approach, it did cross my mind to utilize the canonical. Do you know of any way to change WordPress permalink structure going forward but not backwards? Or are you suggesting we keep the dates in the URL going forward? I just think that eventually we'll have to think about updating that URL structure.
-
OK, now that I understand the reasoning...
I believe there's a better, less risky approach. What I would do is write a completely new post based on information from the old post. At the same time you publish the new post, go back to the old version and add two things: a canonical tag pointing to the new version, and a bit of very readable text at the top linking to the new post. Something like: "Hey, thanks for your interest in our content. Feel free to read on, but we thought you should know we've updated this post, which can be found here: link."
This accomplishes a few important things. It eliminates the need for a risky project that could affect your entire site just for the ability to update posts (which I'm guessing doesn't happen too often, what percent of posts get updated?). The canonical tag removes the dupe content risk so you're not cannibalizing your own content. And leaving the old post there gives people the opportunity to discover old content that, while possibly not relevant anymore, still demonstrates you've been a trustworthy source of information for a long time.
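If it helps to visualize, the two additions to the old post would look something like this (a sketch; the URL is a made-up example):

<!-- In the old post's <head>: -->
<link rel="canonical" href="https://yourwebsite.com/updated-post/" />

<!-- At the top of the old post's content: -->
<p>Hey, thanks for your interest in our content. Feel free to read on, but we thought you should know we've <a href="https://yourwebsite.com/updated-post/">updated this post here</a>.</p>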
-
Logan,
By not being able to remove the dates, we're not able to go back to a 5-year-old post, make updates, and then republish the content. This is a "mom blog" and the topics can be recycled, but if we create a new post on a topic we also covered 5 years ago, we'd be competing with ourselves instead of using something that already has some authority and ranking behind it.
That's why we were thinking to somehow make it possible (in WordPress) to keep all current URLs as they are, change the permalink structure moving forward so that future posts don't have dates, and then update posts as we go, 301ing them manually over time. Does that make sense?
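Concretely, the redirect side of what I'm picturing is just one manual rule per republished post, added as we go (a sketch; these slugs are made up):

Redirect 301 /2012/03/pumpkin-spice-crafts/ /pumpkin-spice-crafts/
Redirect 301 /2013/07/back-to-school-tips/ /back-to-school-tips/

That way only the posts we've actually updated ever get redirected.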
I agree with your last 2 statements; it is a HUGE risk to 301 this entire site just to do away with those dates. Even though redirects supposedly pass all link juice, we all know a big change like that across an entire site could have ill effects with search engines.
I'd like to know if anyone has gone about a URL structure change the way I'm outlining here. Am I crazy to think this is a logical way to go about it? I haven't been able to find anywhere that someone has done this, though.
-
Jeff,
Based on the traffic you say this blog gets, I'm assuming it's rather large and has hundreds, if not thousands, of posts. Which leads me to one simple question:
Why? This seems like a HUGE amount of risk and a pretty decent amount of work to go into something that's really not going to provide any benefit.
Edit: It should also be noted that just because Google has recently stated that redirects now pass all link juice doesn't mean you should needlessly add a massive number of redirects. Redirects have other implications, load time for example. If you have 1,000 redirects, every single one of them is going to be checked before any page on your site loads, which takes a lot of time.
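To illustrate the load-time point (a sketch, not anyone's actual config), a file full of one-off rules like:

Redirect 301 /2010/01/post-one/ /post-one/
Redirect 301 /2010/01/post-two/ /post-two/
# ...and 998 more, each evaluated in order on every request

does more work per request than a single pattern rule such as:

RedirectMatch 301 ^/([0-9]{4})/([0-9]{2})/(.*)$ /$3

which covers every dated URL in one check.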
-
Thanks for your response. I actually agree with most, if not all of what you are saying.
The problem is that this is a larger blog with 5-7 million page views on average per month, 1 million+ of which come from organic alone. I agree with your point about postponing and never getting it done. With a large blog, I still think it would be less stressful (if not necessarily easier) to manage it in waves, so we can pause or course-correct if there's a larger-than-normal dip that doesn't come back up. With a business it makes sense to bite the bullet, but with these bloggers' sites it seems like too big a risk when it's what brings in almost all the income. Does that make sense?
As for the tweet you're referring to, I thought that was mainly in regard to HTTP-to-HTTPS migrations. I guess I need to look into that more.
Thanks!
-
I'm not a fan of your plan.
There can be many reasons why a site might "take a hit": for example, if page-to-page redirects were not implemented, or the sitemap was not updated correctly or resubmitted to search engines. I wouldn't assume that will happen in your case. In my experience, if the transition is done correctly and there is a hit, it's short-lived.
If you're thinking the redirects will cause you to lose SEO equity, that is no longer the case. Gary Illyes, a Google webmaster trends analyst, tweeted on July 26, 2016 "30x redirects don’t lose PageRank anymore."
One of the biggest risks (in my mind) of staging the migration the way you suggest is that the "waves" never happen. I see that a lot - a situation where an organization agrees to postpone work to a future date that never arrives. New and competing priorities take precedence, resulting in an endless postponement. If you have the management commitment, funding, and resources to do the work now, I say bite the bullet and go for it. Make a plan. Stick to it. Check and double-check your work.