Parameter handling (where to find all parameters to handle)?
-
Google recently said they updated their parameter handling, but I was wondering: what is the best way to know all of the parameters that need "handling"? Will Google Webmaster Tools find them? Should the company know based on what is on their site? Thanks!
-
Yeah, Google will generally come across just about all of the parameters while indexing your site, but you can add parameters manually as well. When you log into Google Webmaster Tools, you should see a list of parameters on the Site configuration > URL parameters page. They've added more options that you can change for each parameter, beyond whether or not Google should ignore it. If you click Edit for a parameter, you can now set:
- Does this parameter change page content seen by the user? (Yes, No)
- How does this parameter affect page content? (Sorts, Narrows, Specifies, Translates, Paginates, Other)
- Which URLs with this parameter should Googlebot crawl? (Let Googlebot decide, Every URL, Only URLs with value ___, No URLs)
It will also show you sample URLs containing the parameter, which makes it easier to figure out where these parameters appear. That's very useful, as sometimes you don't know which pages have which parameters.
Google's help documentation for URL parameters covers this in more detail.
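If you'd rather not rely only on what Webmaster Tools surfaces, you can also build your own inventory from any URL list you already have (server logs, a crawl export, a sitemap). A minimal sketch in Python; the example URLs are hypothetical placeholders for your own data:

```python
from urllib.parse import urlparse, parse_qs
from collections import Counter

def count_parameters(urls):
    """Count how often each query-parameter name appears across a URL list."""
    counts = Counter()
    for url in urls:
        query = urlparse(url).query
        for name in parse_qs(query, keep_blank_values=True):
            counts[name] += 1
    return counts

# Hypothetical URLs; in practice, read these from logs or a crawl export.
urls = [
    "http://example.com/shoes?color=red&sort=price",
    "http://example.com/shoes?color=blue&page=2",
    "http://example.com/shirts?sort=price",
]
print(count_parameters(urls))  # a Counter of parameter names and frequencies
```

Parameters that show up often are the ones worth reviewing against the list Webmaster Tools already found.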
Related Questions
-
Best Way to Handle Near-Duplicate Content?
Hello Dear MOZers,

Having duplicate content issues, and I'd like some opinions on how best to deal with this problem.

Background: I run a website for a cosmetic surgeon in which the most valuable content area is the section of before/after photos of our patients. We have 200+ pages (one patient per page) and each page has a 'description' block of text and a handful of before and after photos. Photos are labeled with very similar labels patient-to-patient ("before surgery", "after surgery", "during surgery", etc.). Currently, each page has a unique rel=canonical tag. But MOZ Crawl Diagnostics has found these pages to be duplicate content of each other. For example, using a 'similar page checker', two of these pages were found to be 97% similar.

As far as I understand there are a few ways to deal with this, and I'd like to get your opinions on the best course:
- Add 150+ more words to each description text block
- Prevent indexing of patient pages with robots.txt
- Set the rel=canonical for each patient page to the main gallery page

Any other options or suggestions? Please keep in mind that this is our most valuable content, so I would be reluctant to make major structural changes, or changes that would result in any decrease in traffic to these pages.

Thank you folks, Ethan
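Before committing to any of the options above, one way to quantify the similarity the crawl diagnostics are flagging is to compare the extracted page text directly. A rough sketch using Python's standard difflib; the sample strings are hypothetical stand-ins for the real page text:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0..1 similarity ratio between two blocks of page text."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical extracted text from two patient pages.
page_a = "Before surgery photo. After surgery photo. Patient recovered well."
page_b = "Before surgery photo. After surgery photo. Patient healed quickly."
print(round(similarity(page_a, page_b), 2))
```

Running this over each pair of patient pages shows how much unique description text would actually need to be added to pull the ratio down.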
Technical SEO | BernsteinMedicalNYC
-
Blocked URL parameters can still be crawled and indexed by google?
Hi guys, I have two questions, and one might be a dumb question, but there it goes. I just want to be sure that I understand: If I tell Webmaster Tools to ignore a URL parameter, will Google still index and rank my URL? Is it OK if I don't append the brand filter in the URL structure; will I still rank for that brand? Thanks. PS: OK, 3 questions :)...
Technical SEO | catalinmoraru
-
Internal link structure: find out if there are any internal links to this page
When I use this URL in Open Site Explorer, it says that there are no internal links:
http://goo.gl/d2s6tJ
Page Authority is also 1; it should be higher if there are any internal links to it, right? But I am very sure there are links to this URL on my website, for example on this URL:
http://goo.gl/ucixRH
How certain can I be of this? Because if I can be very certain, then we have an internal link structure problem on our entire site, I believe.
Technical SEO | wilcoXXL
-
How to find all crawlable links on a particular page?
Hi! This might sound like a newbie question, but I'm trying to find all crawlable links (that Googlebot sees) on a particular page of my website. I'm trying to use Screaming Frog, but that gives me all the links on that particular page AND all subsequent pages in the given sub-directory. What I want is ONLY the crawlable links pointing away from a particular page. What is the best way to go about this? Thanks in advance.
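For a single page, the raw anchor hrefs can be extracted without a full crawl. A minimal standard-library Python sketch (the sample HTML and URLs are hypothetical placeholders); note it only sees links present in the fetched HTML, so links injected by JavaScript won't appear:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base_url, href))

# In practice, fetch the page HTML first (e.g. with urllib.request).
html = '<a href="/about">About</a> <a href="http://other.com/">Out</a>'
parser = LinkExtractor("http://example.com/page")
parser.feed(html)
print(parser.links)  # ['http://example.com/about', 'http://other.com/']
```

In Screaming Frog itself, limiting the crawl depth so only the start page is fetched gets a similar result without any scripting.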
Technical SEO | AB_Newbie
-
Best way to handle pages with iframes that I don't want indexed? Noindex in the header?
I am doing a bit of SEO work for a friend, and the situation is the following: the site is a place to discuss articles on the web. When clicking on a link that has been posted, the user is sent to a URL on the main site at URL.com/article/view. This page has a large iframe that contains the article itself, and a small bar at the top with various links to get back to the original site. I'd like to make sure that the comment pages (URL.com/article) are indexed instead of all of the URL.com/article/view pages, which won't really do much for SEO. However, all of these pages are indexed. What would be the best approach to make sure the iframe pages aren't indexed? My intuition is to just put a "noindex" in the header of those pages, and make sure that the conversation pages themselves are properly linked throughout the site so that they get indexed properly. Does this seem right? Thanks for the help...
Technical SEO | jim_shook
-
How to handle a future company expansion?
One of my clients is looking to start a new company and they are thinking of SEO right from the get-go. While this is great for me, there are a few issues that I have never really encountered before. For instance, my client knows that she will be expanding into a different city in the future but wants to generate local traffic to start with. She will initially start with CITY-A before moving to CITY-B one year later. Which of the following would be a better solution?

1) Have CITY-A targeted on the root domain for one year, build links and grow the site for CITY-A, then create two subdomains in one year targeting CITY-A and CITY-B (i.e. CITY-A.companyname.com and CITY-B.companyname.com), then make the root domain a generic company site with no mention of location (or mentions of both locations).

2) Create the two subdomains now and begin with CITY-A.companyname.com, and have the root domain be a general overview of the company and our services without being location specific.

3) Create the root domain (companyname.com) and have that target CITY-A, keep it targeting the initial city, then create a subdomain in a year to target CITY-B.

I keep going between these solutions and seem to have hit a mental block. What are your thoughts? Any other ideas are more than welcome! Thanks, Net66
Technical SEO | net66
-
Parameter Handling - No URLs Question
We're trying to make sense of Google's new parameter handling options, and I seem unable to find a good answer to an issue regarding the "No URLs" option. For example, we have two URLs pointing to the same content:
http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map?zoom=1&x=0.518&y=0.3965
http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map?zoom=2&x=0.518&y=0.3965
Ideally, I would want Google to index only the main URL without any parameters:
http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map
To do this, I would set the value "No URLs" for the zoom, x, and y parameters. By doing this, do we still get any SEO value from backlinks that point to the URLs with the parameters, or will Google just ignore them?
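Independent of the Webmaster Tools setting, a common complementary approach is to compute and expose the parameter-free URL yourself (e.g. in a rel=canonical tag) so that link value consolidates on it. A hedged Python sketch of stripping such view-state parameters, using the URL from the question above:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

IGNORED = {"zoom", "x", "y"}  # view-state parameters that don't change content

def canonical_url(url, ignored=IGNORED):
    """Drop ignorable query parameters, keeping everything else intact."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignored]
    return urlunparse(parts._replace(query=urlencode(kept)))

url = ("http://www.propertyshark.com/mason/ny/New-York-City/Maps/"
       "Manhattan-Apartment-Sales-Map?zoom=1&x=0.518&y=0.3965")
print(canonical_url(url))
# http://www.propertyshark.com/mason/ny/New-York-City/Maps/Manhattan-Apartment-Sales-Map
```

Emitting that stripped URL as the canonical on every parameterized variant is one way to keep the backlink value regardless of how Google treats the parameter setting.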
Technical SEO | propertyshark
-
Is there a good tool for finding the outbound links on a domain?
Hi, I am trying to find the number and preferably a list of outbound links on a site that has thousands of pages. Is there a good tool that you can recommend? Unless I missed it, I haven't seen this feature in SEOMoz. Thanks!
Technical SEO | SparkplugDigital