URL Masking or Cloaking?
-
Hi guys,
On our webshop, the menu links to the categories we want to rank for in Google. Because the menu is sitewide, I assume Google treats those categories as important and may rank them better (internal links).
The problem I'm facing is that we differentiate by gender. The menu has Men and Women entries, which link to /category?gender=1 and /category?gender=2. But we don't want to rank for the gender URLs; we want to rank for the default URL.
For example:
- Focus keyword: shoes
- Menu link for Men: /shoes?gender=1
- Menu link for Women: /shoes?gender=2
We only want to rank for /shoes/, but that URL isn't in the menu, and every URL containing "?" is set to noindex, follow.
So I was thinking of linking both Men and Women in the menu to /shoes/, but swapping in the ?gender parameter on mousedown (scripted that way). Would Google consider this cloaking?
Alternatively, we could add a canonical to the /shoes/ page, but I don't know whether the ?gender pages would still pass internal linking value if they carry a canonical.
I hope this makes sense. Other advice is also welcome, such as placing all the default URLs in the footer.
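For reference, the mousedown swap described above could be sketched roughly like this (the `genderUrl` helper and `data-gender` attribute are hypothetical names, and this only illustrates the technique being asked about, not a recommendation):

```javascript
// Pure helper: append a gender parameter to a clean category URL,
// using "&" if the URL already carries a query string.
function genderUrl(cleanHref, gender) {
  const sep = cleanHref.includes('?') ? '&' : '?';
  return cleanHref + sep + 'gender=' + gender;
}

// In the browser you would wire it up roughly like this, so the link
// shows /shoes/ until the moment the user presses the mouse button:
//
// document.querySelectorAll('a[data-gender]').forEach(function (a) {
//   a.addEventListener('mousedown', function () {
//     a.href = genderUrl(a.getAttribute('href'), a.dataset.gender);
//   });
// });
```

Since Googlebot would see a different destination than a clicking user, this is exactly the behavior the answers below weigh in on.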
-
That's true; they append parameters that track where you came from, which looks like it affects the navigation you see on the left. They make sure Google doesn't get confused by using a canonical on those pages, as Mike, Eric, and I have recommended.
-
This website is doing the same:
In the filters on the left, the URL you see when hovering over a link is different from the actual URL you land on when you click it.
-
Hi there,
Like Mike and Eric have said, I'd recommend using a canonical tag on the men's and women's pages to the version of the page that shows both genders' shoes/clothing.
That said, I just want to make sure this is the best path for your site. If it makes more sense for your site to point people to shoes and clothing by gender, shouldn't that be what you show in Google's search results? I'm a woman, and generally search for "women's shoes" since otherwise I often end up on pages that show men's options.
Let us know if these solutions work!
Kristina
-
So to clarify: when you say "menu," are you talking about faceted navigation or actual page navigation (near the header/footer)? If it's faceted, you should canonical back to the main page so you're not competing with other pages on your own site (/mens-shoes or /womens-shoes). But if you canonical the men's or women's pages back to the main /shoes/ page, you will lose the ranking benefit of those pages.
Does the site only work off of parameters, or do you have separate pages for different genders?
-
You might be better served by using a canonical to point the parameterized URLs at the base page, i.e. /shoes?gender=1 with a rel="canonical" pointing at /shoes. It depends on how varied the content of the pages is, whether you're cannibalizing your own keywords, etc.
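As a minimal sketch of that setup (www.example.com is a placeholder domain), the parameterized pages would each carry the same canonical in their head:

```html
<!-- In the <head> of both /shoes?gender=1 and /shoes?gender=2 -->
<link rel="canonical" href="https://www.example.com/shoes/" />
```

With this in place the gender pages can stay linked in the menu; the canonical tells Google which URL to consolidate signals to.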