Which Is the Best Way to Handle Query Parameters?
-
Hi mozzers,
I would like to know the best way to handle query parameters.
Say my site is example.com. Here are two scenarios.
Scenario #1: Duplicate content
example.com/category?page=1
example.com/category?order=updated_at+DESC
example.com/category
example.com/category?page=1&sr=blog-header
All have the same content.
Scenario #2: Pagination
example.com/category?page=1
example.com/category?page=2 and so on.
What is the best way to solve both?
Do I need to use rel=next and rel=prev, or is it better to use Google Webmaster Tools parameter handling? Right now I am concerned about Google traffic only.
For solving the duplicate content issue, do we need to use canonical tags on each such URL?
I am not using WordPress. My site is built on the Ruby on Rails platform.
Thanks!
-
The new pagination advice is really tough to navigate. I have mixed feelings about rel=prev/next (hard to implement, doesn't work on Bing, etc.) but it seems generally reliable. If you have pagination AND parameters that impact pagination (like sorts), then you need to use prev/next and canonical tags. See the post Alan cited.
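Since the site is on Rails, here's a rough ERB sketch of that combination in the head of the category view. The @page and @total_pages variables and the category_url helper are assumptions for illustration, not anything from the thread: each sorted or tracking-tagged variant canonicals to its clean paginated URL, while prev/next chains the series.

```erb
<%# Hypothetical sketch: prev/next plus canonical for parameterized pagination.
    @page, @total_pages, and category_url are assumed, not from the thread. %>
<%# The canonical drops sort/tracking params but keeps the page number. %>
<link rel="canonical" href="<%= category_url(page: @page > 1 ? @page : nil) %>">
<% if @page > 1 %>
  <%# prev from page 2 points at the clean first page (no ?page=1). %>
  <link rel="prev" href="<%= category_url(page: @page > 2 ? @page - 1 : nil) %>">
<% end %>
<% if @page < @total_pages %>
  <link rel="next" href="<%= category_url(page: @page + 1) %>">
<% end %>
```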
I actually do think NOINDEX works fine in many cases, if the paginated results (pages 2+) have little or no search value. It really depends on the situation and the scope, though. This can range from no big deal at all to a huge problem, depending on the site in question, so it's tough to give general advice.
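If you do go the NOINDEX route, it's just a conditional meta tag in the layout's head. A minimal sketch, assuming a hypothetical @page variable set by the controller:

```erb
<%# Minimal sketch: noindex pages 2+, but keep following links so the
    items on those pages still get crawled. @page is an assumption. %>
<% if @page.to_i > 1 %>
  <meta name="robots" content="noindex, follow">
<% end %>
```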
I'm not having great luck with GWT parameter handling lately (as Alan said), especially on big sites. It just doesn't seem to work in certain situations, and I have no idea why Google ignores some settings and honors others. That one's driving me crazy, actually. It's easy to set up and you can try it, but I wouldn't count on it working.
-
No, don't de-index them; just use prev/next.
Yes, you are right, it is only for Google. I really can't give you an answer as to what to do for both; you could use a canonical for Bing only. It's a hard one.
See this page for more info: http://googlewebmastercentral.blogspot.com.au/2011/09/pagination-with-relnext-and-relprev.html
-
Which do you think is ideal?
De-indexing pages 2+, or simply using rel=next and rel=prev? That's also only for Google, right?
-
For the first scenario, use a canonical tag.
For the second, use the prev/next tags; to Google, this will make page one look like one big page with all the content of all the pages on it.
Don't use parameter handling; it is a last resort, it is only for Google (though Bing has its own), and its effectiveness has been questioned.
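For the first scenario, the canonical can simply point every parameter variant back at the clean URL. A minimal ERB sketch (request.base_url and request.path are standard Rails request methods; dropping the query string is the point):

```erb
<%# Sketch for scenario #1: the ?order=, ?sr=, and ?page=1 variants all
    declare the clean category URL as canonical. %>
<link rel="canonical" href="<%= request.base_url + request.path %>">
```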
-
The problem is that we are talking about thousands of pages, and doing it manually is close to impossible. Even if it can be engineered, it will take a lot of time. Unless Webmaster Tools cannot effectively handle this situation, it doesn't make sense to go and change the site code.
-
Hi Mohit,
Seems like a waste of time to me when you can put a simple meta tag in there.
-
How about using parameter handling in Google Webmaster Tools to ignore ?page=1, ?order=updated_at+DESC, and so on? Does that work instead of including canonical tags on all such pages?
-
I can speak to the first scenario: that is exactly what rel="canonical" is for. It suits dynamic pages that have a purpose for URL appendages, or the rare case where you can't control your server (.htaccess) for 301 redirects.
As for pagination, I may not have the best answer, as I have been using rel="canonical" in those cases too.
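On Rails there's no .htaccess anyway, so a 301 would go in the routing layer. A hypothetical sketch (the old path is made up for illustration):

```ruby
# config/routes.rb -- hypothetical example path; redirect() issues a
# 301 (moved permanently) by default in Rails routing.
get "/old-category", to: redirect("/category")
```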