Variables in URLs?
-
How much do variables in URLs hurt the indexing of a page? I'm worried that with this huge string of variables the pages won't get indexed.
Here's what I think we should have: http://adomainname.com/New/Local/State/City/Make/Model/
Here's the current URL: http://adomainname.com/New/Local/MN/Bayport/Jeep/Liberty?curPage=1&pageResultSize=50&orderDir=DESC&orderBy=ModifiedDate&conditionId=1&makeId=7&modelId=141&stateProvinceName=Minnesota&mc=1
-
Gotcha thx for the help
-
I'm not finding a great example, offhand, but basically, I'm suggesting that you could have city/state URLs, like:
www.example.com/illinois/chicago
...make/model URLs, like:
www.example.com/honda/crv
...and individual product URLs, like:
www.example.com/honda-crv-1234
... but that you DON'T mix the first two and end up with something like:
www.example.com/illinois/chicago/honda/crv
...because you're going to spin out a ton of thin content. There's no perfect way these days to do a lot of state/city pages - Panda hasn't been kind to that. Ideally, you need some sort of unique content for each one. If you're just spinning out content to target keywords, you're likely to harm your site eventually.
-
Hey Pete, can you please point out a site that does this?
I know it's not good to add too many links to a page, and that we should only target two keywords on each page. What is the best way to optimize for every city within a state?
Let's say we have 20 cities we want to target - is it best to build 10 pages, each one targeting two keywords? What is the best practice for this?
-
Yeah, my gut reaction (although there's no one-size-fits-all answer for every site) is in line with @blu42 - it's not so much about the folder depth here, it's that this structure is inevitably going to create "thin" content, possibly by the truckload. Post-Panda, the days of just spinning out state+city+product are pretty much gone. It used to work great for long-tail search. Now, you risk it not only not working, but actually damaging your entire site.
It would be much better, IMO, to have some kind of state/city structure but then land on the same make/model page, regardless of geography. You can get some geo-targeting that way AND landing pages for products, but don't try to cross them on every variation. The tiny amount of long-tail traffic you'd pick up will probably be dwarfed by the problems you'll have.
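To make that concrete with purely hypothetical URLs: a geo page like
www.example.com/minnesota/bayport
...could link out to shared product pages like
www.example.com/jeep/liberty
...rather than spinning out www.example.com/minnesota/bayport/jeep/liberty for every possible combination.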
-
It should get indexed; however, it is very deep...
It's probably a better idea to structure it as http://adomainname.com/Category/Location/Make&Model/
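If you go with a clean structure like that, the usual trick is to rewrite the friendly URL to whatever script actually reads the parameters. Here's a minimal sketch, assuming Apache's mod_rewrite in an .htaccess file - the inventory.php handler and the parameter names are assumptions for illustration, not your site's real ones:
    RewriteEngine On
    # Map /New/Local/State/City/Make/Model/ (the structure from the question)
    # to a hypothetical script that expects query-string parameters;
    # QSA keeps any extra query string that's already on the request.
    RewriteRule ^New/Local/([^/]+)/([^/]+)/([^/]+)/([^/]+)/?$ inventory.php?state=$1&city=$2&make=$3&model=$4 [L,QSA]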
-
Ahhh, you will definitely run into duplicate content issues!
You want to include a rel="canonical" pointing to the domain and path without the variables. You don't want this sort of URL indexed (use a robots meta tag with noindex,follow); it's much better to show the clean URL in Google.
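As a rough sketch of what those tags could look like in the head of the parameterized page (using the variable-free URL from the question as the assumed canonical target):
    <!-- hedged example: canonical target assumed to be the clean, variable-free URL -->
    <link rel="canonical" href="http://adomainname.com/New/Local/MN/Bayport/Jeep/Liberty" />
    <meta name="robots" content="noindex,follow" />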
-
Related Questions
-
Site scraped over 400,000 URLs
Our business is heavily dependent on SEO traffic from long-tail search. We have over 400,000 pieces of content, all of which we found scraped and published by another site based out of Hong Kong (we're in the US). Google has a process for DMCA takedowns, but going through it would be beyond tedious for such a large set of URLs. The scraped content is outranking us in many searches, and we've noticed a drastic decrease in organic traffic, likely from a duplicate content penalty. Has anyone dealt with an issue like this? I can't seem to find much help online.
-
Backlinks that go to a redirected URL
Hey guys, just wondering: my client has 3 websites, 2 of the 3 will be closed down and the domains will be permanently redirected to the 1 primary domain - however, they have some high-quality backlinks pointing to the domains that will be redirected. How does this affect SEO?
Domain One (primary - getting redesigned and rebuilt) - not many backlinks
Domain Two (will redirect to Domain One) - has quality backlinks
Domain Three (will redirect to Domain One) - has quality backlinks
When the new website is launched on Domain One I will contact the backlink providers and request they update their URL - I assume that would be best.
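For the domain-level redirects themselves, a minimal sketch (assuming Apache, with a made-up hostname standing in for Domain One) placed on Domain Two and Domain Three would be:
    # Hedged example: permanently redirect every path on this domain
    # to the same path on the primary domain (hypothetical hostname).
    Redirect 301 / http://www.domain-one.example/
-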
Robots.txt Syntax for Dynamic URLs
I want to Disallow certain dynamic pages in robots.txt and am unsure of the proper syntax. The pages I want to disallow all include the string ?Page= - which is the proper syntax?
Disallow: ?Page=
Disallow: ?Page=*
Disallow: ?Page=
Or something else?
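For reference, a hedged sketch of how this is commonly written for crawlers that support wildcards in robots.txt (Googlebot does; not every crawler will):
    User-agent: *
    # blocks any URL containing the literal string "?Page="
    Disallow: /*?Page=
-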
Sitemaps, 404s and URL structure
Hi All! I recently acquired a client and noticed in Search Console over 1300 404s, all starting around late October this year. What's strange is that I can access the pages that are 404ing by cutting and pasting the URLs and via inbound links from other sites. I suspect the issue might have something to do with Sitemaps. The site has 5 Sitemaps, generated by the Yoast plugin. 2 Sitemaps seem to be working (pages being indexed), 3 Sitemaps seem to be not working (pages have warnings, errors and nothing shows up as indexed). The pages listed in the 3 broken sitemaps seem to be the same pages giving 404 errors. I'm wondering if auto URL structure might be the culprit here. For example, one sitemap that works is called newsletter-sitemap.xml, all the URLs listed follow the structure: http://example.com/newsletter/post-title Whereas, one sitemap that doesn't work is called culture-event-sitemap.xml. Here the URLs underneath follow the structure http://example.com/post-title. Could it be that these URLs are not being crawled / found because they don't follow the structure http://example.com/culture-event/post-title? If not, any other ideas? Thank you for reading this long post and helping out a relatively new SEO!
-
URL Parameters
On our webshop we've added some URL parameters. We've set URLs like min_price, filter_cat, filter_color, etc. to "don't crawl" in Google Search Console. We see that some parameters have 100,000+ URLs and some have 10,000+. Is it better to add these parameters to the robots.txt file? And if so, how should we write it so the URLs will not be crawled? Our robots.txt file currently shows:
# Added by SEO Ultimate's Link Mask Generator module
User-agent: *
Disallow: /go/
# End Link Mask Generator output
User-agent: *
Disallow: /wp-admin/
-
Keyword in URL vs organization
I have a jobs site that currently has the following structure for jobs:
www.site.com/jobs/openings/1234.html
Categories used to be listed this way:
www.site.com/jobs/openings/accounting
But we changed it to:
www.site.com/jobs/category/accounting
Does it matter? Is one better than the other? The page title, heading, and description also have the word "openings" or "opening" in them.
-
I have altered a URL as it was too long. Do I need to do a 301 redirect for the old URL?
Crawl diagnostics has shown a URL that is too long on one of our sites. I have altered it to make it shorter. Do I now need to do a 301 redirect from the old URL? I have altered a URL previously and the old URL now goes to the home page - I can't understand why. Does anyone know what best practice is here? Thanks
-
Handling '?' in URLs.
Adios! (or something). I've noticed in my SEOMoz campaign that I am getting duplicate content warnings for URLs with extensions. For example:
/login.php?action=lostpassword
/login.php?action=register
etc. What is the best way to deal with these types of URLs to avoid duplicate content penalties in search engines? Thanks 🙂