Adding parameters in URLs and linking to a page
-
Hi,
Here's a fairly technical question:
We would like to implement a badge feature, where websites linking to us via a badge would use URLs such as:
domain.com/page?state=texas&city=houston
domain.com/page?state=nevada&city=lasvegas
Important note: the parameters will change the information and layout of the base page, domain.com/page.
Would those two URLs above, with their extra parameters, be considered the same page as domain.com/page by Google's crawler?
We're considering adding the parameters "state" and "city" to the Google WMT URL parameter tool to tell them how to handle those parameters.
Any feedback or comments are appreciated!
Thanks in advance.
Martin
-
Thanks Paul. You confirmed our understanding.
Another head is always better!
Thanks again.
Martin
-
Sorry for the misunderstanding, Martin. I was misled by this statement:
... the parameter will change the information and layout of the page
If the content of the page will truly be different depending on the parameter, the search engines are probably going to consider them separate URLs no matter what you do. That's their whole definition of separate pages.
If the parameter simply rearranges a page's content (e.g. creates a different sort order for the same products) then combining the pages can usually work.
If you want to try to combine these pages for ranking purposes anyway, the tool to use is the rel=canonical tag. You insert it into the head of each parameterized page, pointing back to the primary page.
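As a rough sketch, using the example URLs from the question (and assuming the site is served over https), a parameterized page like domain.com/page?state=texas&city=houston would carry a tag like this:

```html
<!-- In the <head> of domain.com/page?state=texas&city=houston -->
<!-- Points search engines back to the primary page as the canonical version -->
<link rel="canonical" href="https://domain.com/page" />
```

The same tag, with the same href, goes on every parameter variation of the page, so all of them declare the one parameter-free URL as canonical.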
Google is clear though that canonical tags are taken as suggestions only, and if it thinks your pages should actually be indexed separately, it will ignore the canonical.
You could then back this process up by using Bing & Google WMT to specify that the ?state= and city= parameters are to be ignored. But the above caveat still applies.
Paul
-
Thank you for your feedback Paul.
Well, I guess I didn't explain myself correctly, because we do want to have all the parameters considered one page!
Our goal is twofold:
1- Have the URLs with parameters considered one page, so that the many links pointing to it give one page higher authority (instead of 20+ pages with lower authority each).
2- Keep the user experience relevant by displaying and organizing the page content based on the referring URL (in our case, the links from badges).
So, in other words, how do we (if possible) make sure Google and Bing treat all the parameter variations as one combined page?
Thanks again!
Martin
-
You're actually in the reverse position of most people dealing with URL parameters, Martin. By default, the Search Engines consider different parameters to be different pages and most users are struggling with how to make the SEs understand they're all one page to avoid duplicate content issues.
In your case, you want the SEs to treat those pages according to the way the SEs normally do, which means you "shouldn't" need to do anything extra to get them indexed separately. That's what's naturally going to happen.
I'm with you on the hinting in Google Webmaster Tools though. May as well use that capability to confirm for Google that you do want those parameters indexed. You should do the same thing in Bing Webmaster Tools as well. Kind of a "belt & suspenders" approach.
Paul