URL Structure & Best Practice when Facing 4+ Sub-levels
-
Hi.
I've spent the last day fiddling with the setup of a new URL structure for a site, and I can't "pull the trigger" on it.
- Example: domain.com/games/type-of-game/provider-name/name-of-game/
- Specific example: arcade.com/games/pinball/deckerballs/starshooter2k/
The example is a good description of the content I have to organize. The aim is to a) define the URL structure, b) facilitate good UX, and c) create a good starting point for content marketing and SEO, while avoiding keyword stuffing in URLs.
The problem? Not all providers offer the same types of game. That means once I get past /type-of-game/, I have to write a new category page, with new content, for each /provider-name/. No matter how I rearrange the sub-levels in the URL, at some point the provider-name level doesn't fit, because it needs fresh content in multiple places.
The solution? I can skip "provider-name". The caveat is that I lose out on ranking for provider keywords, as I won't have a cornerstone content page for them.
Question: Using the URL structure outlined above in WordPress, would you A) go with "Pages", or B) use "Posts"?
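To make the two candidate structures concrete, here's a minimal sketch in Python (purely illustrative; the site itself would run on WordPress/PHP, and the `slugify`/`build_url` helper names are made up for this example) that builds both URL variants from the same game data:

```python
import re

def slugify(text):
    """Lowercase the text and replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def build_url(domain, *segments):
    """Join slugified path segments into a full URL under the given domain."""
    path = "/".join(slugify(s) for s in segments)
    return f"https://{domain}/{path}/"

# Structure A: include the provider level
with_provider = build_url("arcade.com", "games", "pinball", "Deckerballs", "Starshooter 2K")
# Structure B: skip the provider level
without_provider = build_url("arcade.com", "games", "pinball", "Starshooter 2K")

print(with_provider)     # https://arcade.com/games/pinball/deckerballs/starshooter-2k/
print(without_provider)  # https://arcade.com/games/pinball/starshooter-2k/
```

Either structure can be expressed in WordPress with hierarchical Pages or with a custom post type plus custom taxonomies; the sketch is only about what the resulting paths look like.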
-
I'm not sure I'm correct, since I'm not in the gaming industry, but I don't think there would be many SEO benefits to having the provider name in the URL. A few reasons I wouldn't add it:
- Your URL will be much longer; Moz suggests keeping it within about 75 characters, including the https://www. prefix
- I'm guessing the more important keyword here is the game name; by including the provider name, you push your important keyword further from the root domain
- It costs more time and effort to create and manage content for each provider page
- The provider name is a branded keyword; it would be hard for you to outrank the provider for its own brand name, and the page wouldn't give you much SEO value
When I search for "Battlefield 1", almost all the top results have "Battlefield 1" close to their root domain.
https://www.windowscentral.com/battlefield-1-2018
https://www.gamespot.com/battlefield-1/
https://www.g2a.com/en/battlefield-1-origin-key-global-i10000016618004
https://www.origin.com/sgp/en-us/store/battlefield/battlefield-1#store-page-section-criticalacclaim
I think, in the end, it depends on which keyword you're trying to rank for and whether having the provider name helps.
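To sanity-check the length point above, here is a quick sketch (Python, illustrative only; `url_length_ok` is a made-up helper, and 75 characters is the guideline cited in the answer, not a hard limit) comparing the two structures against that budget:

```python
def url_length_ok(url, limit=75):
    """Return (length, within_budget) for a full URL, scheme and www included."""
    return len(url), len(url) <= limit

candidates = [
    "https://www.arcade.com/games/pinball/deckerballs/starshooter-2k/",
    "https://www.arcade.com/games/pinball/starshooter-2k/",
]

for url in candidates:
    length, ok = url_length_ok(url)
    print(f"{length:3d} {'OK  ' if ok else 'LONG'} {url}")
```

In this example both variants happen to fit the budget, so for these particular slugs the length argument alone wouldn't force dropping the provider level; it matters more for longer game and provider names.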
-
Thank you for both your time and effort in response to my thread, Joseph.
The reason I don't want to use WordPress's tag function is that tags are a WP convenience, not something beneficial to SEO. On the contrary, tag archives create new URLs, and with them duplicate and thin content.
Following up on your point about "relevant content to the keywords": does that mean I can skip "provider-name" in the URL?
I've seen competitors that include the provider in the URL, and they outrank others that don't. I recognize that the URL is not the only signal or factor here, but I'm seeing a trend, hence my question.
-
Hey Dan,
Though a keyword in the URL can be helpful, it isn't much of a ranking factor. If your page has content relevant to the keywords, Google will be able to tell. Alternatively, since you're using WordPress, would "tags" be another solution?
Regards,
Joseph Yap