Q Parameters
-
I'm having several site issues and I want to see if the Q parameter in the URL is the issue.
Both of these URLs get indexed, and any capitalization combination brings up another indexed page:
http://www.website.com/index.php?q=contact-us
and
http://www.website.com/index.php?q=cOntact-us
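A quick way to see the duplication: to a search engine every capitalization variant is a distinct URL, yet they all normalize to the same q value, i.e. the same page. A minimal sketch using only the Python standard library (the URLs are the hypothetical examples from above):

```python
from urllib.parse import parse_qs, urlsplit

def canonical_q(url: str) -> str:
    """Return the lowercased value of the q parameter: the one real page key."""
    query = parse_qs(urlsplit(url).query)
    return query.get("q", [""])[0].lower()

variants = [
    "http://www.website.com/index.php?q=contact-us",
    "http://www.website.com/index.php?q=cOntact-us",
    "http://www.website.com/index.php?q=CONTACT-US",
]

# Three distinct URLs, one logical page: classic duplicate content.
print({canonical_q(u) for u in variants})  # {'contact-us'}
```

A 301 redirect (or rel=canonical) from every variant to the lowercase form collapses them back into one indexable URL.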
The other issue is Google crawl errors. The website has been receiving more and more spam crawl errors. I've read that this is a common issue and is most likely a Googlebot problem. Would removing the q parameter fix this entirely?
Here is an example:
http://www.website.com/index.php?q=uk-cheap-chloe-bay-bag-wholesale-shoes
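The usual root cause of those spam crawl errors is that index.php answers any q value with a 200 page, so every spammy inbound link mints a new crawlable URL. A minimal sketch of the fix, assuming a hypothetical whitelist of real slugs: return a hard 404 for anything not on it.

```python
# Hypothetical whitelist of real page slugs; a live site would load
# these from its CMS or routing table.
KNOWN_PAGES = {"contact-us", "about-us", "products"}

def status_for(q: str) -> int:
    """HTTP status that index.php should return for a given q value."""
    if q.lower() in KNOWN_PAGES:
        return 200  # real page: render it
    return 404      # unknown slug: a hard 404 keeps spam URLs out of the index

print(status_for("contact-us"))                              # 200
print(status_for("uk-cheap-chloe-bay-bag-wholesale-shoes"))  # 404
```

Once the junk URLs return 404, Google drops them over time and the crawl error reports settle down.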
-
Thanks, Ryan. I'm going to remove the parameters. I'm glad these issues are related.
-
Using a parameter to determine which page to bring up is problematic.
Search engines are getting better at crawling these types of URLs, but why leave anything to chance? And from the human perspective, a traditional static URL is light-years better.
Which is easier for you to tell a friend about?
coolsite.com/neat-article or coolsite.com/index.php?q=neat-article
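If you do migrate to static URLs, a 301 from each old parameter URL to its static twin preserves the links you've already earned. A rough sketch of the mapping, Python standard library only (coolsite.com is the example domain from above):

```python
from typing import Optional
from urllib.parse import parse_qs, urlsplit

def redirect_target(url: str) -> Optional[str]:
    """301 target for a legacy ?q= URL; None means no redirect is needed."""
    parts = urlsplit(url)
    q = parse_qs(parts.query).get("q", [None])[0]
    if q is None:
        return None  # already a static URL
    return f"{parts.scheme}://{parts.netloc}/{q.lower()}"

print(redirect_target("http://coolsite.com/index.php?q=neat-article"))
# http://coolsite.com/neat-article
```

In practice this lives in the server config (e.g. an Apache RewriteRule) rather than application code, but the mapping logic is the same.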
Related Questions
-
Role of Robots.txt and Search Console parameters settings
Hi, wondering if anyone can point me to resources or explain the difference between these two. If a site has URL parameters disallowed in robots.txt, is it redundant to set the Search Console parameter settings to anything other than "Let Googlebot Decide"?
Technical SEO | LivDetrick
-
Adding a parameter to the URL / URL Structure
Dear Community, I would like to ask a question regarding URL structure. We are struggling with shortening URLs and we thought of adding a "parameter" to the URL. Example: domain.com/product/a/ or domain.com/a/product/. Current URL structure: domain.com/product/. So we would go with a short URL containing "/a/" to find the category we want. Is this going to harm our SEO strategy? Any idea is welcome.
Technical SEO | geofil
-
Blocked URL parameters can still be crawled and indexed by Google?
Hi guys, I have two questions, and one might be a dumb question, but here it goes. I just want to be sure that I understand: if I tell Webmaster Tools to ignore a URL parameter, will Google still index and rank my URL? Is it OK if I don't append the brand filter to the URL structure? Will I still rank for that brand? Thanks. PS: OK, 3 questions :)...
Technical SEO | catalinmoraru
-
Similar pages: noindex, rel=canonical, or disregard parameters?!
Hey all! We have a hotel booking website that has search results pages per destination (e.g. hotels in NYC is dayguest.com/nyc). Pages are also generated for destinations depending on various parameters, which can be star rating, amenities, style of the properties, etc. (e.g. dayguest.com/nyc/4stars, dayguest.com/nyc/luggagestorage, dayguest.com/nyc/luxury, etc.). In general, all of these pages are very similar; for example, there might be 10 hotels in NYC and all of them will offer luggage storage. Pages can be nearly identical. Hence the problems of duplicate content and loss of juice by dilution. I was wondering what the best practice is in such a situation: should I just mark all pages except the most important ones (e.g. dayguest.com/nyc) as noindex? Or set the main page as canonical for all variations? Or, in Google Webmaster Tools, ask Google to disregard the URLs with various parameters? Or do something else altogether?! Thanks for the help!
Technical SEO | Philoups
-
GWT, URL Parameters, and Magento
I'm getting into the URL parameters in Google Webmaster Tools and I was just wondering if anyone that uses Magento has used this functionality to make sure filter pages aren't being indexed. Basically, I know what the different parameters (manufacturer, price, etc.) are doing to the content: narrowing. I was just wondering what you choose after you tell Google what the parameter's function is. For narrowing, it gives the following options for "Which URLs with this parameter should Googlebot crawl?":
Let Googlebot decide (default)
Every URL (the page content changes for each value)
Only URLs with value (may hide content from Googlebot)
No URLs
I'm not sure which one I want. Something tells me probably "No URLs", as this content isn't something a user will see unless they filter the results (and, therefore, should not come through on a search to this page). However, the page content does change for each value. I want to make sure I don't exclude the wrong thing and end up with a bunch of pages disappearing from Google. Any help with this is greatly appreciated!
Technical SEO | Marketing.SCG
-
Pagination and SEO: how do I fix it when URLs contain search parameters?
Today, I watched a very interesting video on YouTube about pagination and SEO. I have implemented pagination with rel="next" and rel="prev" on my paginated pages. You can get a better idea by visiting the following pages: www.vistastores.com/patio-umbrellas www.vistastores.com/patio-umbrellas?p=2 www.vistastores.com/patio-umbrellas?p=3. I have added a NOINDEX, FOLLOW attribute to page 2, page 3, and so on. I have a simple question: can I remove the NOINDEX, FOLLOW attribute from the paginated pages or not? I have big confusion and issues when paginated URLs contain search parameters. You can get a better idea by visiting the following URLs: http://www.vistastores.com/patio-umbrellas?dir=asc&order=name&p=2 http://www.vistastores.com/patio-umbrellas?dir=asc&order=name&p=3. What is the best suggestion for these kinds of pages?
Technical SEO | CommercePundit
-
Good technical parameters, worse load time.
I have recently created a page and added expires headers, unconfigured ETags, and gzip to the .htaccess code, and just after that, according to Pingdom Tools, my page load time has doubled, although my YSlow points went from 78 to 92. I always get a little bit lost with this technical stuff. I mean, obviously a site should not produce worse results after adding these parameters, and this increase in page load time should rather be due to bandwidth usage. I suppose I should leave this stuff in the .htaccess. Then what is an accurate way to know if you have made a real improvement to your site or if your load time has really gone up? This question is especially relevant with CSS sprites, as I always read that sometimes spriting every picture is a waste of resources. How can you decide when to stop?
Technical SEO | sesertin
-
Do links count if they have no href attribute?
An SEOmoz report indicates that we have a large number of links on our pages, mainly due to an embedded mega drop-down and lots of product display options that are user-activated but otherwise hidden. Most of the links have the attribute href="#", because the links are used in combination with jQuery to trigger actions. It is still possible to trigger the actions without the href attribute, so the question is: do links without href attributes count towards the total number of links on the page, since a link with href="#" is actually an internal page link? Our site (this version of the site has not had empty tags removed): http://emilea.be/
Technical SEO | Webxtrakt