Old content with trailing `/` - What should be my new approach?
-
Dear Moz team,
I'm investigating SEO issues for a site that dropped rankings over a period of 4-6 months after converting from its old platform (XenForo) to a new custom-developed platform.
The old version of the site was a simple XenForo-based forum, with threads using the standard URL structure www.domain.com/threads/thread-title.{thread_id}/ (note the trailing slash). We chose to keep the URLs intact during the conversion to the new platform; however, the site still lost rankings. I'm sure there could be multiple reasons for that, but I'd like to know whether I should adjust the URLs by:
1. 301 redirecting all URLs with a trailing / to the equivalent URLs without it, or
2. Leaving the URLs as they are.
I should also mention that the new site has several new sections, and the old forum is just one part of it. The rest of the site uses URLs without a trailing /, as that's the URL structure recommended by Google. I'd really appreciate your suggestions on this.
-
Thank you for your response, @Optimal_Strategies. Of course, the right way to get rid of all trailing slashes throughout the site is to enforce the rule in the NGINX configuration.
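For reference, here is a minimal sketch of what such a rule might look like (the server name, port, and backend address are placeholders, not our actual configuration):

```nginx
server {
    listen 80;
    server_name www.domain.com;

    # 301-redirect any non-root URL ending in "/" to the same URL without it,
    # e.g. /threads/thread-title.123/ -> /threads/thread-title.123
    # ("permanent" makes the rewrite return a 301 rather than a 302;
    # query strings are appended automatically).
    rewrite ^/(.+)/$ /$1 permanent;

    location / {
        # Pass everything else to the application as usual (placeholder address).
        proxy_pass http://127.0.0.1:8080;
    }
}
```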
However, the real question we have is which strategy is right:
1. Leave the existing content on the site with the trailing "/" and continue building the rest of the site with URLs without trailing slashes.
OR
2. Enforce URLs without trailing slashes site-wide and 301 redirect all of the old URLs to the new ones.
-
From what I know, most platforms have a plugin or module for removing trailing slashes, so it's worth trying that first. If you can't manage it that way, use 301 redirects, though this can cause issues if URLs with a trailing "/" keep being created. Hope that helps.