URL Structure
-
Hi Guys,
I'm in the process of creating a very exciting startup aimed at the baby industry. It's essentially a social commerce site where parents can shop for products, create lists of products, and ask questions.
The challenge I'm facing is how best to structure my URLs from an SEO standpoint. For example, a common baby topic such as "feeding" can sit in all three categories:
- Shopping category aggregates all products related to feeding
- List category aggregates all lists related to feeding
- Question category aggregates all questions and answers on feeding
So for the keyword "feeding" you have three potential landing pages. What is the most effective way of handling this? I was thinking of something along these lines:
- /shopping/feeding
- /baby_list/feeding
- /ask/feeding
Would love to hear your points of view on this.
Thanks!
Walid
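To make the question concrete, here is a minimal Python sketch of the proposed scheme: one topic slug mapped to a landing page under each of the three category prefixes. The prefix names are taken from the examples above and are illustrative only.

```python
# Category prefixes as proposed above; illustrative, not a recommendation.
CATEGORY_PREFIXES = ("shopping", "baby_list", "ask")

def landing_pages(topic_slug: str) -> list[str]:
    """Return the candidate landing-page path for each category."""
    return [f"/{prefix}/{topic_slug}" for prefix in CATEGORY_PREFIXES]

print(landing_pages("feeding"))
# ['/shopping/feeding', '/baby_list/feeding', '/ask/feeding']
```

The open question in the thread is whether three pages targeting the same keyword this way compete with each other, and how to differentiate them.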
-
Hi,
I agree with your suggestion. This is the best way to structure your new URLs:
- /shopping/feeding/
- /baby-list/feeding/
- /ask/feeding/
Thanks
Tahir
-
Hi! We're going through some of the older unanswered questions and seeing if people still have questions or if they've gone ahead and implemented something and have any lessons to share with us. Can you give an update, or mark your question as answered?
Thanks!
-
I was considering not only the visitor who can guess, but also Google. Perhaps keeping the category "list" less specific means it could later subdivide into premature / newborn / toddler etc. as the market dictates.
Just a thought.
-
I would stick with:
- /shopping/feeding/
- /baby-list/feeding/
- /ask/feeding/
I believe a visitor should be able to pretty much guess the URL if they know what they want; that's how I tend to come up with mine. If I want to shop, I might type "shop" or "shopping", then "baby feeding" or just "feeding", and then, if I want a bottle, "bottle" goes next. So you're on the right track.
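The guessable, broad-to-narrow hierarchy described here (shopping → feeding → bottle) can be sketched as a simple path builder in Python; the segment values are hypothetical examples:

```python
def guessable_path(*segments: str) -> str:
    """Build a path a visitor could type from memory:
    broadest category first, then progressively narrower terms."""
    return "/" + "/".join(s.strip("/").lower() for s in segments) + "/"

print(guessable_path("shopping", "feeding", "bottle"))
# /shopping/feeding/bottle/
```

Each extra segment narrows the scope, so deeper pages stay consistent with the shallower category pages above them.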
-
Hi Walid - if that is what your landing page is about, then I believe it won't be a problem (anyone else have a view?), and it does "what it says on the tin". Just be sure to keep your page titles, meta descriptions, etc. different, though.
Hope that helps
-
Thanks for your prompt reply, PH292, but let's assume that you have those same topics for Lists and Asks, so for example you could have:
/shopping/toddler-feeding
/list/toddler-feeding
/ask/toddler-feeding
I'm still not sure if this is the best approach for a URL structure.
-
Hi Walid - It is difficult to give a concise answer without seeing the complete URL. However, my one tip would be: when you are listing "feeding", for example, it could be feeding anything, so you could have (based on your research, of course):
/shopping/baby-feeding
/shopping/toddler-feeding
That way it becomes a bit more optimized and user-friendly.
If the key part of the URL leaves no doubt, then you could have:
/feeding-tips
/feeding-products
/feeding-accessories
Hope that helps
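The qualified-slug idea above (prefixing the topic with the audience it serves, e.g. `/shopping/baby-feeding`) could be sketched like this; the segment names are hypothetical and would come from keyword research:

```python
# Hypothetical audience segments; real ones would come from keyword research.
AGE_SEGMENTS = ("baby", "toddler")

def qualified_paths(category: str, topic: str) -> list[str]:
    """One landing page per audience segment, e.g. /shopping/baby-feeding/."""
    return [f"/{category}/{segment}-{topic}/" for segment in AGE_SEGMENTS]

print(qualified_paths("shopping", "feeding"))
# ['/shopping/baby-feeding/', '/shopping/toddler-feeding/']
```

This trades one broad page per category for several narrower, more descriptive ones, which is only worthwhile if search volume supports each variant.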