Parameter handling (where to find all parameters to handle)?
-
Google recently said they updated their parameter handling, but I was wondering: what is the best way to find all of the parameters that need "handling"? Will Google Webmaster Tools find them? Should the company know based on what is on their site? Thanks!
-
Yeah, Google will generally come across just about all of the parameters while indexing your site, but you can add parameters yourself as well. When you log into Google Webmaster Tools, you should see a list of parameters on the Site configuration > URL parameters page. Google has added more options you can change for each parameter, beyond whether or not Google should ignore it. If you click Edit for a parameter, you can now set:
- Does this parameter change page content seen by the user? (Yes, No)
- How does this parameter affect page content? (Sorts, Narrows, Specifies, Translates, Paginates, Other)
- Which URLs with this parameter should Googlebot crawl? (Let Googlebot decide, Every URL, Only URLs with value ___, No URLs)
It will also show you sample URLs with the parameter to make it easier to figure out where these parameters appear, which is very useful, as sometimes you don't know which pages have which parameters.
Google's help file for this can be found here.
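If you don't want to wait for Webmaster Tools to surface every parameter, you can build your own inventory of parameter names from any list of URLs you already have (a crawl export or server logs). A minimal sketch in Python's standard library, assuming the URLs are already collected into a list:

```python
from urllib.parse import urlsplit, parse_qsl

def parameter_names(urls):
    """Collect every distinct query-parameter name seen across a list of URLs."""
    names = set()
    for url in urls:
        names.update(key for key, _ in parse_qsl(urlsplit(url).query))
    return sorted(names)

urls = [
    "https://example.com/shop?colour=blue&size=100cm",
    "https://example.com/shop?sort=price",
    "https://example.com/about",
]
print(parameter_names(urls))  # ['colour', 'size', 'sort']
```

Comparing this list against what Webmaster Tools shows is a quick way to spot parameters Google hasn't discovered yet.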
Related Questions
-
How to add parameter to url with 301 and wildcard
So this is my situation. I want to redirect example.com/post1/ to example.com/post1/?m=yes
Technical SEO | CarlLSweet
-
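A hedged sketch of that redirect in an Apache .htaccess file with mod_rewrite. The path and the m=yes parameter come from the question; the wildcard pattern and the loop guard are assumptions, so test before deploying:

```apache
RewriteEngine On
# Guard: skip requests that already carry m=yes, or the 301 would loop.
RewriteCond %{QUERY_STRING} !(^|&)m=yes(&|$)
# Wildcard: append ?m=yes to every top-level post URL.
# Add the QSA flag if existing query strings should be preserved.
RewriteRule ^([^/]+)/$ /$1/?m=yes [R=301,L]
```

For a single post only, replace the pattern with `^post1/$`.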
What is the best way to handle Product URLs which prepopulate options?
We are currently building a new site which can pre-populate product options based on parameters in the URL. We have done this so that we can send individual product URLs to Google Shopping. I don't want to create lots of duplicate pages, so I was wondering what you thought was the best way to handle this? My current thoughts are:
1. Sessions and parameters: On-site product page filters populate using sessions, so no parameters are required on-site, but options can still be pre-populated via parameters (product?colour=blue&size=100cm) if the user reaches the site via Google Shopping. We could also add "noindex, follow" to the pages with parameters and a canonical tag pointing to the page without parameters.
2. Text-based parameters: Make the parameters into text-based URLs (product/blue/100cm/) and still use the "noindex, follow" meta tag plus a canonical tag pointing to the page without parameters. I believe this is possibly the best solution, as it still allows users to link to and share pre-populated pages, but they won't get indexed and the link juice would still pass to the main product page.
3. Standard parameters: After thinking more today, I am considering that the best way may be the simplest: use standard parameters (product?colour=blue&size=100cm) so that I can tell Google what they do in Webmaster Tools, and also add "noindex, follow" to the pages with parameters along with the canonical tag pointing to the page without parameters.
What do you think the best way to handle this would be?
Technical SEO | moturner
-
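Whichever option is chosen, the canonical URL for a parameterized product page can be derived by stripping the option-prepopulating parameters. A minimal sketch in Python; the parameter names in OPTION_PARAMS are taken from the question's example and would need to match the real site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters assumed to only pre-populate options (from the question's example).
OPTION_PARAMS = {"colour", "size"}

def canonical_url(url):
    """Drop option parameters so the canonical tag can point at the bare
    product URL; any other parameters are kept."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in OPTION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_url("https://example.com/product?colour=blue&size=100cm"))
# https://example.com/product
```

The returned value is what would go in the page's rel="canonical" link element.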
Best way to handle pages with iframes that I don't want indexed? Noindex in the header?
I am doing a bit of SEO work for a friend, and the situation is the following: the site is a place to discuss articles on the web. When a user clicks on a link that has been posted, it sends them to a URL on the main site at URL.com/article/view. This page has a large iframe that contains the article itself, and a small bar at the top with various links to get back to the original site. I'd like to make sure that the comment pages (URL.com/article) are indexed instead of all of the URL.com/article/view pages, which won't really do much for SEO. However, all of these pages are currently indexed. What would be the best approach to make sure the iframe pages aren't indexed? My intuition is to just put a "noindex" in the header of those pages, and make sure that the conversation pages themselves are properly linked throughout the site so that they get indexed properly. Does this seem right? Thanks for the help...
Technical SEO | jim_shook
-
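The "noindex in the header" intuition can be applied either as a robots meta tag in the HTML head or as an X-Robots-Tag HTTP response header (which also works for non-HTML responses). A minimal sketch of the routing decision in Python; the /article/view path shape comes from the question, but the function and its wiring into the server are assumptions:

```python
def robots_directive(path):
    """Return the X-Robots-Tag header value for a request path: the iframe
    wrapper pages (.../view) get "noindex, follow", everything else returns
    None and is left to default indexing."""
    if path.rstrip("/").endswith("/view"):
        return "noindex, follow"
    return None
```

The web framework or server config would attach the returned value as an `X-Robots-Tag` header on matching responses.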
Can someone PLEASE help me find a solution to use a custom search engine for iPhones?? Thanks in advance!
Hey mozzers, I'm in quite the pickle today and would really appreciate some help! I need a way for my members to set the default custom search engine on their iPhones and Androids to our site's search engine. Google Chrome allows this on desktops but not on iPhones. Thanks for your time/help, Tyler Abernethy
Technical SEO | TylerAbernethy
-
How to handle large numbers of comments?
First the good news: one site I've been working on has seen an increase in traffic from 2k/month to 80k! As well as lots of visitors, the site is also getting lots of comments, with one page getting more than 70 comments/day and showing no sign of a slowdown! Approximately 3,000 comments in total and growing! What is the best approach for handling this? I'm not talking about review/approval/response, but about how these comments are presented on the website, taking both SEO and usability into account. Does anyone have any particular recommendations? Options I've considered are:
- Just show the most recent x comments and ignore the rest. (Nobody is going to read 3,000 comments!)
- Paginate comments (risk of duplicate content? Using Ajax could hide long-tail phrases in comments?)
- Show all comments (page load speed is suffering, and this is likely to be causing problems for mobile visitors)
Also, how do active comments on a page contribute to an article's freshness? Any thoughts would be greatly appreciated.
Technical SEO | DougRoberts
-
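The "most recent x comments" and pagination options can be combined: page 1 shows the newest comments and older pages are linked for crawlers and dedicated readers. A minimal sketch of the slicing logic in Python; the per_page value is arbitrary and the function names are hypothetical:

```python
def comment_page(comments, page, per_page=50):
    """Return one page of comments (newest first) plus the previous and
    next page numbers, or None at either end. `comments` is assumed to be
    in oldest-first order."""
    newest_first = list(reversed(comments))
    start = (page - 1) * per_page
    items = newest_first[start:start + per_page]
    prev_page = page - 1 if page > 1 else None
    next_page = page + 1 if start + per_page < len(newest_first) else None
    return items, prev_page, next_page
```

The prev/next page numbers map naturally onto visible pagination links, which keeps the long-tail comment text crawlable (unlike Ajax-only loading).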
How to find all the links to my site
Hi, I have been trying to find all the links to my site http://www.clairehegarty.co.uk but I am not having any luck. I have used Open Site Explorer, but it is not showing all the links, and when I go to my Google Webmaster page it shows me more pages than the SEOmoz tool does. Can anyone help me sort this out and find out exactly what links are going into my site? Many thanks
Technical SEO | ClaireH-184886
-
Best Way To Handle Expired Content
Hi, I have a client's site that posts job openings. There is a main list of available jobs and each job has an individual page linked to from that main list. However, at some point the job is no longer available. Currently, the job page goes away and returns a status 404 after the job is no longer available. The good thing is that the job pages get links coming into the site. The bad thing is that as soon as the job is no longer available, those links point to a 404 page. Ouch. Currently Google Webmaster Tools shows 100+ 404 job URLs that have links (maybe 1-3 external links per). The question is what to do with the job page instead of returning a 404. For business purposes, the client cannot display the content after the job is no longer available. To avoid duplicate content issues, the old job page should have some kind of unique content saying the job is longer available. Any thoughts on what to do with those old job pages? Or would you argue that it is appropriate to return 404 header plus error page since this job is truly no longer a valid page on the site? Thanks for any insights you can offer.
Matthew
Technical SEO | Matthew_Edgar
-
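One commonly suggested alternative to a 404 here is a 410 Gone, which tells crawlers the page was removed deliberately; the response body can still carry a unique "this job has closed" message and a link to the live listings, which addresses the duplicate-content concern. A minimal sketch of the status decision in Python; the field names are hypothetical:

```python
from datetime import date

def job_response_status(job, today):
    """Pick an HTTP status for a job page: 200 while the job is open,
    410 Gone once it has closed. `job` is assumed to carry a
    `closes_on` date field."""
    if job["closes_on"] >= today:
        return 200
    return 410
```

Whether 404 or 410 is used, serving a helpful closed-job page on that status preserves some value from the inbound links without pretending the job still exists.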
How to find a specific link on my website (currently causing redirects)
Hi everyone, I've used crawlers like Xenu to find broken links before, and I love these tools. What I can't figure out is how to find specific pieces of code within my site. For example, Webmaster Tools tells me there are still links to old pages somewhere on my website but I just can't find them. Do you know of a crawler that can search for a specific link within the html? Thanks in advance, Josh
Technical SEO | dreadmichael
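The "search for a specific link within the HTML" idea can be sketched with Python's standard-library html.parser: pull every href out of each fetched page and report which pages link to the target. Fetching is left out; the {url: html} mapping and substring match are assumptions of this sketch:

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def pages_linking_to(pages, target):
    """pages: {page_url: html}. Return the page URLs whose HTML contains
    an anchor whose href includes `target` (substring match)."""
    hits = []
    for url, html in pages.items():
        finder = LinkFinder()
        finder.feed(html)
        if any(target in href for href in finder.links):
            hits.append(url)
    return hits
```

Feeding this the pages from a Xenu-style crawl would show exactly which pages still carry links to the old URLs that Webmaster Tools is reporting.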