Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
SEO effect of URL with subfolder versus parameters?
-
I'll make this quick and simple. Let's say you have a business located in several cities. You've built individual pages for each city (linked to from a master list of your locations).
For SEO purposes is it better to have the URL be a subfolder, or a parameter off of the home page URL:
https://www.mysite.com/dallas which is essentially https://www.mysite.com/dallas/index.php
or
https://www.mysite.com/?city=dallas which is essentially https://www.mysite.com/index.php?city=dallas
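For context, both forms are usually served by the same script. A minimal, hypothetical Apache `.htaccess` sketch (assuming mod_rewrite is enabled and an `index.php` that reads a `city` parameter) that lets you present the clean subfolder URL while running the parameter version internally:

```apache
# Hypothetical .htaccess sketch -- internally maps the clean
# subfolder URL (/dallas) onto the parameter URL (index.php?city=dallas),
# so visitors and search engines only ever see /dallas.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([a-z-]+)/?$ index.php?city=$1 [L,QSA]
```

The two conditions skip real files and directories, so only the city pages hit the rewrite.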
-
Thanks Miriam, This is very helpful and makes a lot of sense. What do you think of towns and villages, or boroughs of a large city. Do you think the close proximity is dangerous territory re: keyword permutations?
I take your point about unique content tailored to the people of the city - it makes a lot of sense. But what about locations that are closer to each other?
I know it's a tricky question but any insight would be most welcome.
-
That's a good question, Andrew. It's true that it's no longer a best practice to build out a set of pages featuring slightly different permutations of a keyword (car repair, auto repair, repairing cars, fixing cars, etc.). That approach is now quite dated. Honestly, it never made any sense beyond the fact that when Google wasn't quite so sophisticated, you could trick your way into some additional rankings with this type of redundant content.
The development of location landing pages is different. These are of fundamental use to consumers, and the ideal is to create each city's landing page in a way that is uniquely helpful to a specific audience. So, for example, your store in Detroit is having a special on winter clothing right now, because it's still snowing there. Meanwhile, your store in Palm Beach is already stocking swim trunks. For a large, multi-location enterprise, location landing pages can feature highly differentiated content, including highlights of regionally appropriate inventory and specials, as well as unique NAP, driving directions, reviews from local customers, and so much more.
The key to avoiding the trap of simply publishing a large quantity of near-duplicate pages is to put in the effort to research the communities involved and customize these location pages to best fit local needs.
-
Hi Searchout,
Good for you for creating a unique page for each of your locations. I like to keep URLs as simple as possible, for users, so I'd go with:
https://www.mysite.com/dallas
etc.
From an SEO perspective, I don't think there's a big difference between root URLs and subfolders. If you're using one structure, I doubt you'd see any difference from doing it differently (unless you were using subdomains, which is a different conversation).
-
Of course cities will be counted.
That's why I'm always reinforcing the idea of creating UNIQUE and special pages for each keyword.
Google is getting smarter and smarter, so simple variations of a few words are easily detected. Hope it helps.
Best of luck.
GR.
-
Hi
Thanks for your response. I'm interested in this too. I've been targeting cities with their own pages, but I heard recently that Google is going to be clamping down on multiple keyword permutations. Do you think cities will be counted in this?
-
Hi there!
In my opinion, for SEO purposes it is correct to have a unique page (really different from the others, not just changing the city name and location) for each big city you are optimizing.
That said, a subfolder is useful in order to show Google the name of the city in the URL. It is common that Google considers parameters differently than folders. Also, remember to avoid duplicate content: /dallas/ and /dallas/index.php should not both be accessible and indexable for Google. Redirect one to the other or canonicalize one to the other. Same with the www, non-www, http, and https versions.
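To sketch the redirect approach described above (hypothetical Apache rules, assuming mod_rewrite; substitute your own hostname), forcing a single https://www version and collapsing /dallas/index.php onto /dallas/ could look like:

```apache
# Hypothetical .htaccess sketch -- one canonical URL per page.
RewriteEngine On

# 301 any http:// or non-www request to the https://www host.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.mysite.com/$1 [R=301,L]

# 301 a directly requested /dallas/index.php to /dallas/.
RewriteCond %{THE_REQUEST} /index\.php [NC]
RewriteRule ^(.*/)?index\.php$ /$1 [R=301,L]
```

Alternatively, a rel="canonical" link tag on each page consolidates the variants without redirects.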
Hope it helps.
Best of luck.
GR.