How to fix issues regarding URL parameters?
-
Today, I was reading Google's help article on URL parameters:
http://www.google.com/support/webmasters/bin/answer.py?answer=1235687
From it, I learned that Google gives value to URLs whose parameters change or determine the content of a page. There are many pages on my website with similar values for name, price, and number of products, but I have blocked all of those pages in robots.txt with the following syntax.
URLs:
http://www.vistastores.com/table-lamps?dir=asc&order=name
http://www.vistastores.com/table-lamps?dir=asc&order=price
http://www.vistastores.com/table-lamps?limit=100
Syntax in robots.txt:
Disallow: /?dir=
Disallow: /?p=
Disallow: /*?limit=
Now I am confused. Which is the best solution to get the maximum SEO benefit?
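A note on the syntax above: robots.txt rules match as prefixes from the start of the URL path, so a rule like Disallow: /?dir= only blocks that parameter on the homepage URL, not on a path such as /table-lamps?dir=asc. If the goal is to block these parameters on any path, a sketch using wildcard rules (assuming Google-style wildcard support) might look like this:
User-agent: *
Disallow: /*?dir=
Disallow: /*?p=
Disallow: /*?limit=
A parameter that only ever appears after another parameter, such as &p= in ?limit=100&p=2, would need its own pattern along the lines of Disallow: /*&p=.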
-
No, I don't think so. Even if Google thought the pages were duplicates, it would pick one as the original, so one of them will rank.
If you are still concerned, use the canonical tag rather than removing them from the index.
-
So your point is that Google will crawl all of the following pages if I do nothing with them, right?
http://www.vistastores.com/table-lamps
http://www.vistastores.com/table-lamps?limit=100&p=2
http://www.vistastores.com/table-lamps?limit=60&p=2
http://www.vistastores.com/table-lamps?limit=40&p=2
Right now, my website is on the 3rd page of Google for the keyword "Discount Table Lamps".
I am afraid that if Google crawls multiple pages with duplicate title tags, it may mess up my current ranking for that keyword.
What do you think?
-
If the content is different, then don't do anything; if it is duplicate, use the canonical tag.
The meta tags are not a problem and you are not going to get flagged for them. It would be better if you could make them unique, but this is a very small issue.
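If you do want to make them unique, one common approach is simply to vary the title and meta description by page number. A hypothetical sketch for the second page of results (the wording here is purely illustrative):
<title>Discount Table Lamps - Page 2 | Vista Stores</title>
<meta name="description" content="Page 2 of our discount table lamps range." />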
-
Will it really work? Both pages have different content:
http://www.vistastores.com/table-lamps has 100 products, and
http://www.vistastores.com/table-lamps?limit=100&p=2 has a different, unique set of 100 products.
Another problem is the meta information. Both pages have the same meta info, so if Google indexes both pages it may trigger duplicate-meta warnings across too many pages.
-
That's the advice I gave you: put a canonical tag in the page.
<link rel="canonical" href="http://www.vistastores.com/table-lamps" />
Then, if Google finds http://www.vistastores.com/table-lamps?dir=asc&order=name,
it will know it is meant to be http://www.vistastores.com/table-lamps.
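For context, every parameter variation of the page would carry the same tag in its <head>, for example (illustrative markup only):
On http://www.vistastores.com/table-lamps?dir=asc&order=price:
<link rel="canonical" href="http://www.vistastores.com/table-lamps" />
On http://www.vistastores.com/table-lamps?limit=100:
<link rel="canonical" href="http://www.vistastores.com/table-lamps" />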
-
Honestly, I am not quite getting it. The Google help article about URL parameters that I read suggests something different: it recommends handling parameters in Google Webmaster Tools. But I have blocked all dynamic pages with robots.txt.
So, I want to know the best practice that will help me improve crawling and the number of indexed pages.
-
I would simply put a rel canonical in the page pointing to the true URL, so search engines will see them as one page.
It is better to use the canonical tag for the reasons given in the Google doc you posted: Google may not pick the best URL as the canonical, so you should make that choice for them.
Related Questions
-
410 or 301 after URL update?
Hi there, A site i'm working on atm has a thousand "not found" errors on google console (of course, I'm sure there are thousands more it's not showing us!). The issue is a lot of them seem to come from a URL change. Damage has been done, the URLs have been changed and I can't stop that... but as you can imagine, i'm keen to fix as many as humanly possible. I don't want to go mad with 301s - but for external links in, this seems like the best solution? On the other hand, Google is reading internal links that simply aren't there anymore. Is it better to hunt down the new page and 301-it anyway? OR should I 410 and grit my teeth while google crawls and recrawls it, warning me that this page really doesn't exist? Essentially I guess I'm asking, how many 301s are too many and will affect our DA? And what's the best solution for dealing with mass 404 errors - many of which aren't attached or linked to from any other pages anymore? Thanks for any insights 🙂
Intermediate & Advanced SEO | Fubra
-
Site-wide Canonical Rewrite Rule for Multiple Currency URL Parameters?
Hi Guys, I am currently working with an eCommerce site which has site-wide duplicate content caused by currency URL parameter variations. Example: https://www.marcb.com/ https://www.marcb.com/?setCurrencyId=3 https://www.marcb.com/?setCurrencyId=2 https://www.marcb.com/?setCurrencyId=1 My initial thought is to create a bunch of canonical tags which will pass on link equity to the core URL version. However I was wondering if there was a rule which could be implemented within the .htaccess file that will make the canonical site-wide without being so labour intensive. I also noticed that these URLs are being indexed in Google, so would it be worth setting a site-wide noindex to these variations also? Thanks
Intermediate & Advanced SEO | NickG-123
-
Should I include URLs that are 301'd or only include 200 status URLs in my sitemap.xml?
I'm not sure if I should be including old URLs (content) that are being redirected (301) to new URLs (content) in my sitemap.xml. Does anyone know if it is best to include or leave out 301ed URLs in a xml sitemap?
Intermediate & Advanced SEO | Jonathan.Smith
-
Pagination parameters and canonical
Hello, we have a site that manages pagination through parameters in URLs, this way:
friendly-url.html
friendly-url.html?p=2
friendly-url.html?p=3
...
We've recently added the canonical tag pointing to friendly-url.html for all paginated results. In Search Console, we have the "p" parameter identified by Google. Now that the canonical has been added, should we still configure the parameter in Search Console and tell Google that it is being used for pagination? Thank you!
Intermediate & Advanced SEO | teconsite
-
Does Google Read URL's if they include a # tag? Re: SEO Value of Clean Url's
An ECWID rep stated in regards to an inquiry about how the ECWID url's are not customizable, that "an important thing is that it doesn't matter what these URLs look like, because search engines don't read anything after that # in URLs. " Example http://www.runningboards4less.com/general-motors#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891 Basically all of this: #!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891 That is a snippet out of a conversation where ECWID said that dirty urls don't matter beyond a hashtag... Is that true? I haven't found any rule that Google or other search engines (Google is really the most important) don't index, read, or place value on the part of the url after a # tag.
Intermediate & Advanced SEO | Atlanta-SMO
-
URL mapping for site migration
Hi all! I'm currently working on a migration for a large e-commerce site. The old one has around 2.5k urls, the new one 7.5k. I now need to sort out the redirects from one to the other. This is proving pretty tricky, as the URL structure has changed site wide. There doesn't seem to be any consistent rules either so using regex doesn't really work. By and large, the copy appears to be the same though. Does anybody know of a tool I can crawl the sites with that will export the crawled url and related copy into a spreadsheet? That way I can crawl both sites and compare the copy to match them up. Thanks!
Intermediate & Advanced SEO | Blink-SEO
-
What is the best URL structure for categories?
A client's site currently uses the URL structure: www.website.com/%category%/%postname% Which I think is optimised fairly well, as the categories are keywords being targeted. However, as they are using a category hierarchy, often times the URL looks like this: www.website.com/parent-category/child-category/some-post-titles-are-quite-long-as-they-are-long-tail-terms Best practise often dictates (such as point 3 in this Moz article) that shorter URLs are better for several reasons. So I'm left with a few options:
Remove the category from the URL
Flatten the category hierarchy
Shorten post titles to a word or two - which would hurt my long tail search term traffic
Leave it as it is
What do we think is the best route to take? Thanks in advance!
Intermediate & Advanced SEO | underscorelive
-
Overly-Dynamic URL
Hi, we have over 5000 pages showing under the Overly-Dynamic URL error. Our ecommerce site uses Ajax and we have several different filters like Size, Color, and Brand, and we therefore have many different URLs like:
http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y
http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
http://www.dellamoda.com/designer-handbags.html?use_selected_filter=Y&option=manufacturer%3A&page3
Could we use the robots.txt file to disallow these from showing as duplicate content? And do we need to put the whole URL in there, like:
Disallow: /*?sort=price&sort_direction=1&use_selected_filter=Y
If not, how far into the URL should be disallowed? So far we have added the following to our robots.txt:
Disallow: /?sort=title
Disallow: /?use_selected_filter=Y
Disallow: /?sort=price
Disallow: /?clearall=Y
Just not sure if they are correct. Any help would be greatly appreciated. Thank you, Kami
Intermediate & Advanced SEO | dellamoda