Slug best practices?
-
Hello, my team is trying to understand how to best construct slugs. We understand they need to be concise and easily understandable, but there seem to be vast differences between the three examples below.
Are there reasons why one might be better than the others?
http://hollywoodlife.com/2014/06/20/jeremy-meeks-sexy-mug-shot-felon-viral/
-
I don't see much difference between the URL structures of Hollywood Life and TMZ; their editors simply chose a short or shorter variant for their news story (in most cases they don't have to worry about SEO). The Washington Post is the unlucky one, as its structure forces a couple of extra directories into every URL.
-
That's a very subjective question, so I'll just list some references that you may want to review: Google's SEO guide (see pages 8-9 for URLs), Ann Smarty's post at SEJ, Moz's guide to URL best practices, and an old Moz post by Rand Fishkin on good URLs. I hope this helps!
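To make the "concise and easily understandable" advice above concrete, here's a minimal slug generator in Python. The stop-word list and the word cap are arbitrary illustrative choices, not a recommendation from any of the guides linked above:

```python
import re
import unicodedata

# Small stop-word list; tune to taste (purely illustrative).
STOP_WORDS = {"a", "an", "the", "and", "or", "of", "in", "on", "to", "for"}

def slugify(title, max_words=6):
    """Turn a headline into a short, hyphen-separated URL slug."""
    # Normalize accented characters to plain ASCII equivalents.
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
    # Lowercase, then keep only letters, digits, whitespace, and hyphens.
    text = re.sub(r"[^a-z0-9\s-]", "", text.lower())
    # Drop stop words and cap the slug length.
    words = [w for w in text.split() if w not in STOP_WORDS]
    return "-".join(words[:max_words])

print(slugify("Jeremy Meeks' Sexy Mug Shot Goes Viral!"))
# → jeremy-meeks-sexy-mug-shot-goes
```

Capping the word count is how you get the shorter, Hollywood Life-style slug rather than the full headline.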
Related Questions
-
Best way to handle deletion of a forum subdomain?
Hello All, Our site www.xxxx.com has long had a forum subdomain, forum.xxxx.com. We have decided to sunset the forum; we find that the 'Ask a Question' function on product pages and our social media presence are more effective ways of answering customers' product & project technical questions.

Simply shutting down the forum server would return thousands of 404s for forum.xxxx.com, which I can't imagine would be helpful for the SEO of www.xxxx.com, even though my understanding is that subdomains are handled somewhat separately from the main site. We rely tremendously on natural search traffic for www.xxxx.com, so I am loath to make any moves that would hurt us.

I was thinking we should keep the forum server up but return 410s for everything on it, including the roughly 3,000 indexed pages, until they are removed from the index, and then shut it down. The IT team also offered the option of simply pointing the subdomain at our main URL, which scares me because forum.xxxx.com would then return a 200 with the same experience as www.xxxx.com, which sounds like a very bad idea. (Yes, we do have canonicals on www.xxxx.com.)

In your opinion, what is the best way to handle this matter? Thank You
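For what it's worth, the "keep the server up but return 410s" option can be sketched as a tiny catch-all app. This is a minimal Python/WSGI illustration of the idea only; in practice a one-line rule in your web server's config does the same job, and the hostname, port, and message below are all made up:

```python
from wsgiref.simple_server import make_server

GONE_BODY = b"This forum has been retired. See our product pages for support."

def gone_app(environ, start_response):
    """Answer every request on the retired forum host with 410 Gone."""
    start_response("410 Gone", [
        ("Content-Type", "text/plain; charset=utf-8"),
        ("Content-Length", str(len(GONE_BODY))),
    ])
    return [GONE_BODY]

# To serve the retired forum.xxxx.com vhost (illustrative):
# make_server("", 8000, gone_app).serve_forever()
```

A 410 tells crawlers the removal is deliberate and permanent, which is the signal you want here rather than a wall of 404s.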
Intermediate & Advanced SEO | jamestown
-
Advanced: SEO best practice for a large forum to minimise risk...?
Hi, I hope someone can offer some insight here. We have a site with an active forum. The transactional side of the site is about 300 pages in total, and the forum is well over 100,000 pages (and growing daily), meaning the 'important' pages account for less than 0.5% of all pages on the site. Rankings are pretty good and we're ticking lots of boxes with the main site: good natural links, logical architecture, appropriate keyword targeting.

I'm worried about the following: crawl budget, PR flow, and Panda. We actively moderate the forum for spam and generally the content is good (for a forum, anyway), so I'm just looking for any best-practice tips for minimising risk. I've contemplated moving the forum to a subdomain so there's that separation, or even noindexing the forum completely, although it does pull in traffic. Has anyone been in a similar situation? Thanks!
Intermediate & Advanced SEO | iProspect_Manchester
-
Best server-side sitemap generators
I've been looking into sitemap generators recently and have a good understanding of what creating a sitemap for a small website of under 500 URLs involves. I have successfully generated a sitemap for a very small site, but I'm trying to work out the best way of crawling a large site with millions of URLs. I've decided that the best approach for such a large number of URLs is a server-side sitemap generator, but this is an area that doesn't seem to be covered in detail on SEO blogs and forums.

Could anyone recommend a good server-side sitemap generator? What do you think of the automated offerings from Google and Bing? I've found a list of server-side sitemap generators from Google, but I can't see any way to choose between them. I realise that a lot will depend on the technologies we use server-side, but I'm afraid I don't know them at this time.
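For what it's worth, a server-side generator doesn't have to be complicated. The Python sketch below (names are mine, not from any of Google's listed tools) accepts URLs from any iterable, so they can be streamed straight from a database cursor, and splits output at the sitemap protocol's 50,000-URL-per-file limit:

```python
from xml.sax.saxutils import escape

MAX_URLS_PER_FILE = 50_000  # sitemaps.org protocol limit per file

def write_sitemaps(urls, prefix="sitemap"):
    """Write one or more sitemap files from an iterable of URLs.

    `urls` can be a generator (e.g. streamed from a database cursor),
    so millions of URLs never need to be held in memory at once.
    Returns the list of filenames written.
    """
    filenames, batch = [], []
    for url in urls:
        batch.append(url)
        if len(batch) == MAX_URLS_PER_FILE:
            filenames.append(_write_file(batch, f"{prefix}-{len(filenames) + 1}.xml"))
            batch = []
    if batch:  # flush the final partial batch
        filenames.append(_write_file(batch, f"{prefix}-{len(filenames) + 1}.xml"))
    return filenames

def _write_file(urls, filename):
    with open(filename, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")
    return filename
```

With millions of URLs you would also need a sitemap index file pointing at the numbered files, which follows the same pattern.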
Intermediate & Advanced SEO | RG_SEO
-
Changing domains - best process to use?
I am about to move my Thailand-focused travel website into a new, broader Asia-focused travel website. The Thailand site has had a sad history with Google (algorithmic, not penalties), so I don't want that history to carry over into the new site. At the same time, I want to capture the traffic that Google is sending me right now, and I would like my search positions on Bing and Yahoo to carry through if possible. Is there a way to make all that happen?

At the moment I have migrated all the posts over to the new domain, but I have it blocked to search engines. I am about to start redirecting post for post using meta-refresh redirects with a nofollow for safety. At the point where I open the new site up to indexing, should I also block the old site from being indexed, to prevent duplicate-content penalties?

Also, is there a method I can use to selectively 301 redirect posts only if the referrer is Bing or Yahoo, but not Google, before the meta refresh fires? Or alternatively, a way to meta-refresh redirect if the referrer is Google but 301 redirect otherwise? Or is there a way to "noindex, nofollow" the redirect only if the referrer is Google? Is there a danger of being penalised for doing any of these things?

Late edit: it occurs to me that if my problems are algorithmic (e.g. due to bad backlinks), does 301 redirection even carry the issue through to the new website? Or is it left behind on the old site?
Intermediate & Advanced SEO | Gavin.Atkinson
-
Best way to move a page without 301
I have a page that currently ranks well for its term. That page is going away for the main website users, meaning all internal site links pointing to it will instead point to a new page. Normally you would just 301 redirect to the new URL; however, the old URL still needs to remain live as a landing page, since we send paid media traffic to it. My question is: what is the best way to deal with that? One thought was to set up a canonical tag; however, my understanding is that the pages need to be identical, or very close to it, and the landing page will be light on content and different from the new main page. Not topically different, but not identical copy or design, etc.
Intermediate & Advanced SEO | IrvCo_Interactive
-
Question about best approach to site structure
I am curious if anyone can share some advice. I am working on planning architecture for a tour company. The key piece of the content strategy will be providing details on each of the tour destinations, with associated profiles for each city within those destinations. Lots of content, which should be great for the SEO strategy.

With regards to the architecture, I have a 'destinations' section on the website where users can access each of the key destinations served by the tour company. My question is: from a planning perspective I can organize my folder structure in a few different ways:

http://www.companyurl.com/destinations/touring-regions/cities/

or

http://www.companyurl.com/destinations/
http://www.companyurl.com/touring-regionA/
http://www.companyurl.com/touring-regionB/cities-profile/

I am curious if anyone has an opinion on what might perform best in terms of site structure from an SEO perspective. My fear is taking all of this rich content and placing it so many tiers down in the architecture of the site. Any advice that could be offered would be appreciated. Thanks.
Intermediate & Advanced SEO | VERBInteractive
-
Best Structure for Multi-Language/International Website
We are getting ready to do a total redesign of our website, which is a multi-language global website (www.hurco.com). Today we use an IP-address lookup to determine country of origin and redirect, for example, to hurco.de for Germany. The main reason for this was that our German division was afraid that their potential customers would go to the hurco.com site and see products that were not available to them.

Is there a better way, from an SEO standpoint, to structure our website? Should we send all hurco.com traffic to a country-selection page and let visitors choose manually? What other good practices should we follow? Would you structure the entire site as www.hurco.com/en-us or /en-canada (language and country) and then have all international domains 301 redirect to the proper one?
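On the "better way from an SEO standpoint" question, the usual alternative to IP-based redirects is to keep every language version crawlable and annotate each page with hreflang alternates, so search engines serve the right version per market themselves. A minimal sketch in Python; the locale map below is illustrative, not Hurco's actual configuration:

```python
# Map hreflang codes to the base URL of each localized site (illustrative).
LOCALES = {
    "en-us": "http://www.hurco.com",
    "de-de": "http://www.hurco.de",
    "x-default": "http://www.hurco.com",  # fallback / country-selector page
}

def hreflang_tags(path):
    """Build the <link rel="alternate"> tags every version of `path` should carry."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{base}{path}" />'
        for code, base in LOCALES.items()
    )

print(hreflang_tags("/products/"))
```

The key property is that every localized version of a page emits the same full set of alternates, including itself, so the annotations are reciprocal.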
Intermediate & Advanced SEO | fassnachtp
-
What are the best suites of SEO tools?
I normally use SEOmoz and a bit of SEMrush, but I don't really know much outside of those two. I'm looking to do a review of the big, trustworthy suites, along the lines of: free trial, price vs value, rank tracking, link-building help, on-page analysis and help, competitor analysis, and reports.

I've heard good things about Raven Tools and Web CEO. I've seen mention of SEOpowersuite on this forum, but the site looks spammy as hell. Anyone have a view on those five tools, or any others in a similar vein? Or any other top-line criteria I should be looking at? Cheers
Intermediate & Advanced SEO | firstconversion | Stephen