Are query string parameters always bad for SEO?
-
I've recently added query string parameters to links that lead to a 'request a quote' form; they auto-fill the 'product' field with the name of the product on the referring product page.
E.g.
Red Bicycle product page >>> link to the RFQ form contains '?productname=Red-Bicycle' >>> the form's product field defaults to 'Red-Bicycle'
I know URL parameters can lead to keyword cannibalisation and duplicate content (we use sub-domains for our language switcher), but for something like this, am I potentially damaging our SEO?
I appreciate I've not explained this very well. We're using Kentico, by the way, so K# macros are a possibility (I use a simple one to fill the form field's default value).
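For illustration, this is roughly the client-side equivalent of what that macro does (just a sketch, not the code we actually run, and the element ID is made up):

```typescript
// Sketch only: read ?productname=... from the URL and pre-fill the product field.
// "product" is a made-up element ID for illustration.
const params = new URLSearchParams(window.location.search);
const productName = params.get("productname");

if (productName) {
  const productField = document.getElementById("product") as HTMLInputElement | null;
  if (productField && !productField.value) {
    // Turn "Red-Bicycle" back into "Red Bicycle" for display.
    productField.value = productName.replace(/-/g, " ");
  }
}
```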
-
No, but I would make sure it's the best approach for your use case. Sometimes you can store this data in a cookie instead. There are also simple SEO safeguards to make sure it won't hurt: usually the best option is still a canonical tag on the parameterised page that points to the page the content originally comes from.
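For example (the domain and path here are placeholders), the parameterised form page would carry something like this in its head, pointing at the clean URL:

```html
<!-- On /request-a-quote?productname=Red-Bicycle -->
<link rel="canonical" href="https://www.example.com/request-a-quote" />
```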
-
Hi Martijn,
Thanks for the reply. Am I going about this completely the wrong way? Would you recommend using local storage instead?
All the best,
Michael
-
Well, it depends on how you set up the data. In the end you can transfer the data from one page to the other in multiple ways (local storage, cookies, a POST request), so in most cases you wouldn't even need a parameter like this and you can keep your URLs as clean as possible.
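As a rough sketch of the storage route (the element IDs and storage key are just examples): the product page saves the name when the visitor clicks through, and the form page reads it back.

```typescript
// On the product page (illustrative IDs/keys): remember the product
// when the visitor clicks the link to the RFQ form.
document.getElementById("rfq-link")?.addEventListener("click", () => {
  sessionStorage.setItem("rfqProduct", "Red Bicycle");
});

// On the RFQ form page: pre-fill the field from storage, then clean up.
const stored = sessionStorage.getItem("rfqProduct");
const field = document.getElementById("product") as HTMLInputElement | null;
if (stored && field && !field.value) {
  field.value = stored;
  sessionStorage.removeItem("rfqProduct");
}
```

That way the form URL itself stays completely clean.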
-
Thanks!
I've defined the parameter in Google Search Console and asked for it not to be crawled.
It isn't actually for tracking; it's simply to auto-fill a form for the customer, hopefully to improve conversions (we sell quite wordy and complex products, and some people visit the form, get in a muddle and prefer to ring us rather than complete it).
If I were to 301 it back to the original URL, wouldn't that immediately move the user to a URL without the query string?
-
Hi,
Parameters are definitely not always a bad thing: used for filtering or pagination they have a great use case, and for tracking purposes, which I think is what you're talking about here, they would be fine too. But you probably want to put some safeguards in place to make sure you don't mess up your SEO. You have a few ways of doing that:
- Define the URL parameters in Google Search Console and tell Google how to handle them.
- If it's just for tracking purposes but the parameterised URL still returns an actual pageview (a 200 status code), it can be a good idea to implement a canonical tag that points back to the original page.
- Robots.txt: far more aggressive, but if you don't want search engines to look at these pages at all you can exclude the parameters through robots.txt (a rough example follows below this list). I would advise against this if you don't have solid SEO knowledge.
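As a rough example of that last option (the parameter name is just the one from this thread), the robots.txt rule could look like:

```
User-agent: *
# Block crawling of any URL whose query string contains the productname parameter
Disallow: /*?*productname=
```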
Hope this was useful!