What Is the Best Way to Handle Query Parameters?
-
Hi mozzers,
I would like to know the best way to handle query parameters.
Say my site is example.com. Here are two scenarios.
Scenario #1: Duplicate content
example.com/category?page=1
example.com/category?order=updated_at+DESC
example.com/category
example.com/category?page=1&sr=blog-header
All of these have the same content.
Scenario #2: Pagination
example.com/category?page=1
example.com/category?page=2 and so on.
What is the best way to solve both?
Do I need to use rel=next and rel=prev, or is it better to use Google Webmaster Tools parameter handling? Right now I am concerned about Google traffic only.
For solving the duplicate content issue, do we need to use canonical tags on each such URL?
I am not using WordPress. My site is built on the Ruby on Rails platform.
Thanks!
-
The new pagination advice is really tough to navigate. I have mixed feelings about rel=prev/next (hard to implement, doesn't work on Bing, etc.) but it seems generally reliable. If you have pagination AND parameters that impact pagination (like sorts), then you need to use prev/next and canonical tags. See the post Alan cited.
I actually do think NOINDEX works fine in many cases, if the paginated search (pages 2+) have little or no search value. It really depends on the situation and the scope, though. This can range from no big deal at all to a huge problem, depending on the site in question, so it's tough to give general advice.
I'm not having great luck with GWT parameter handling lately (as Alan said), especially on big sites. It just doesn't seem to work in certain situations, and I have no idea why Google ignores some settings and honors others. That one's driving me crazy, actually. It's easy to set up and you can try it, but I wouldn't count on it working.
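For the NOINDEX option mentioned above, a minimal sketch of how it might look on a Rails site (the helper name and signature are hypothetical, not from the thread): pages 2+ get noindex,follow, so crawlers still follow links through the series but the thin paginated pages stay out of the index.

```ruby
# Hypothetical Rails-style helper: emit a robots meta tag only on
# paginated pages (page 2 and beyond), leaving page 1 indexable.
def robots_meta_tag(page_param)
  page = page_param.to_i
  if page > 1
    %(<meta name="robots" content="noindex,follow">)
  else
    "" # page 1, or no ?page parameter at all, stays indexable
  end
end
```

Called from the category view's head section, this covers every paginated URL with one code change.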
-
No, don't de-index them; just use rel=prev/next.
Yes, you are right, it is only for Google. I really can't give you an answer as to what to do for both; you could use a canonical for Bing only. It's a hard one.
See this page for more info: http://googlewebmastercentral.blogspot.com.au/2011/09/pagination-with-relnext-and-relprev.html
-
Which do you think is ideal?
De-indexing pages 2+, or simply using rel=next/rel=prev? That's also only for Google, right?
-
For the first scenario, use a canonical tag.
For the second, use the rel=prev/next tags; to Google this will make page one look like one big page with all the content of all the pages on it.
Don't use parameter handling. It is a last resort, it is only for Google (though Bing has its own version), and its effectiveness has been questioned.
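The two fixes above can be sketched for the category page in the question like this (the helper methods are hypothetical; the URLs come from the question):

```ruby
BASE = "http://example.com/category"

# Scenario 1: every parameter variant (?order=..., ?sr=..., ?page=1)
# points back at the one clean URL.
def canonical_tag(base = BASE)
  %(<link rel="canonical" href="#{base}">)
end

# Scenario 2: rel=prev/next chain the series together. Note that the
# "prev" of page 2 is the bare category URL, since ?page=1 duplicates it.
def pagination_tags(page, last_page, base = BASE)
  tags = []
  if page > 1
    prev_url = page == 2 ? base : "#{base}?page=#{page - 1}"
    tags << %(<link rel="prev" href="#{prev_url}">)
  end
  tags << %(<link rel="next" href="#{base}?page=#{page + 1}">) if page < last_page
  tags.join("\n")
end
```

Both helpers would be rendered into the `<head>` of the category view.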
-
The problem is that we are talking about thousands of pages, and doing it manually is close to impossible. Even if it can be engineered, it will take a lot of time. Unless Webmaster Tools cannot effectively handle this situation, it doesn't make sense to go and change the site code.
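For what it's worth, on Rails this doesn't have to be done page by page: a single helper in the shared layout can compute the canonical URL for every page at once. A rough sketch, assuming the parameter names from the question (the helper itself is hypothetical):

```ruby
# Parameters that only re-sort or track the same content and so should
# be dropped from the canonical URL.
TRACKING_OR_SORT_PARAMS = %w[order sr].freeze

def canonical_for(path, params)
  kept = params.reject { |k, _| TRACKING_OR_SORT_PARAMS.include?(k) }
  kept.delete("page") if kept["page"] == "1" # ?page=1 duplicates the bare URL
  return path if kept.empty?
  "#{path}?#{kept.map { |k, v| "#{k}=#{v}" }.join('&')}"
end
```

One call to a helper like this in the layout covers thousands of pages with a single code change, rather than per-page edits.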
-
Hi Mohit,
Seems like a waste of time to me when you can put a simple meta tag in there.
-
How about using parameter handling in Google Webmaster Tools to ignore ?page=1, ?order=updated_at+DESC and so on? Does that work instead of including canonical tags on all such pages?
-
I can speak to the first scenario: that is exactly what rel="canonical" is for, dynamic pages that have a reason for URL appendages, or the rare case where you can't control your server (.htaccess) for 301 redirects.
As for pagination, I may not have the best answer, as I have also been using rel="canonical" in those cases.