URL Parameter for Limiting Results
-
We have a category page that lists products. By default the page displays 9 products, but users can choose to view 15 or 30 products on the same page instead. The parameter is ?limit=9, ?limit=15, and so on. Google is flagging these URLs as having duplicate title tags and meta descriptions in HTML Suggestions. I have a couple of questions.
1. What should my goal be? Should Google crawl the page with 9 items, or the page showing all items in the category?
In Search Console, the first step of setting up a URL parameter asks "Does this parameter change page content seen by the user?". I believe the answer is Yes.
Then, when I select how the parameter affects page content, I assume I'd choose Narrows, since the parameter narrows (or expands) the number of items displayed on the page.
2. When setting up my URL parameters in Search Console, should I select Every URL, or Let Googlebot decide? I'm torn, because the description of Every URL warns that it could result in Googlebot unnecessarily crawling duplicate content on my site (which it's already doing). Reading further, I begin to second-guess the Narrows option as well. Now I'm at a loss on what to do.
Any advice or suggestions will be helpful! Thanks.
-
Thanks for your help David - I apologize for my delayed response.
-
Hi Dustin,
Looks like the problem is that you have two canonical tags on your parameter pages.
e.g., on lines 24 and 25 of the source code for this page https://www.stickylife.com/custom/vinyl-decals?limit=30 you'll see two canonical tags.
When a page has more than one canonical tag, Google ignores them all, which is why you're getting duplicate issues.
You'll need to remove the second canonical tag to resolve this.
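For illustration, a conflicting pair of canonical tags and the corrected single tag look something like this (the URLs below are hypothetical placeholders, not copied from the actual page source):

```html
<!-- Broken: two canonical tags on one page, so Google ignores both. -->
<link rel="canonical" href="https://www.example.com/custom/vinyl-decals" />
<link rel="canonical" href="https://www.example.com/custom/vinyl-decals?limit=30" />

<!-- Fixed: a single canonical pointing at the preferred version of the page. -->
<link rel="canonical" href="https://www.example.com/custom/vinyl-decals" />
```

With one canonical tag on each ?limit= variant pointing at the default page, Google knows which URL to index and can consolidate the duplicate title and description signals.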
Cheers,
David