What is this site doing? Does it look like cloaking to you?
-
Hi there,
I was studying our competitors' SEO strategies, and I noticed that one of our major competitors has set up something pretty weird from an SEO standpoint. I'd like to know your thoughts about it, because I can't find a clear explanation for it.
Here is the deal: the site is musicnotes.com, and their product pages are located inside the /sheetmusic/ directory, so if you want to see all of their product pages indexed by Google, you can search:
site:musicnotes.com inurl:/sheetmusic/
You will get about 290,000 indexed pages. Now, here is the tricky part: click on one of those links and you will get a 302 redirect to a page that includes a meta "noindex, nofollow" directive.
Isn't that pretty weird? Why would they want to "noindex, nofollow" a page behind a 302 redirect? How in the heck is the redirecting page still in the index?! And how can Google allow that?!
All this sounds weird to me and reminds me of spammy '90s techniques like "cloaking"... what do you think?
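For what it's worth, the pattern you're describing is easy to check from a script instead of a browser. Here is a minimal sketch (the HTML below is simulated, nothing actually fetches musicnotes.com, and the helper names are my own):

```python
import re

def meta_robots(html: str) -> set[str]:
    # Pull the comma-separated directives out of a <meta name="robots"> tag.
    # Assumes the name attribute precedes content, which is the common case.
    m = re.search(r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
                  html, re.IGNORECASE)
    return {d.strip().lower() for d in m.group(1).split(",")} if m else set()

def matches_pattern(status: int, target_html: str) -> bool:
    # True when a URL answers with a 302 whose target asks not to be
    # indexed -- the exact setup described above.
    return status == 302 and "noindex" in meta_robots(target_html)

# Simulated redirect target for one of the indexed /sheetmusic/ URLs
html = '<head><meta name="robots" content="noindex, nofollow"></head>'
print(matches_pattern(302, html))  # True
print(matches_pattern(301, html))  # False
```

In a real check you would request each indexed URL with redirects disabled, read the 302's Location header, fetch that target, and run its body through `meta_robots`.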
-
Sure I will! Thanks!
-
If you still need SEO and/or programming advice or work done after the summer, let me know.
-
OK, nice to know... we are always looking for passionate people who can work with us. Thanks!
-
At the moment I am very busy with a couple of projects. In general, I work as an SEO consultant.
Actually, I'm a programmer, but down the line I fell in love with SEO and started doing that too. -
Yes, I'd like to know about that tool.
A question: do you offer SEO consultation?
Thank you again Wesley.
-
Apparently Google keeps the original URL in the index as the source. In some ways it makes sense to do this.
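A rough rule of thumb for which URL survives in the index (my paraphrase, not an official Google spec):

```python
def indexed_url(source: str, destination: str, status: int) -> str:
    # A 301 says the move is permanent, so the destination replaces the
    # source in the index; a 302 says it is temporary, so the original
    # source URL is kept -- which would explain what you are seeing.
    return destination if status == 301 else source

print(indexed_url("/sheetmusic/some-song", "/landing-page", 302))  # /sheetmusic/some-song
print(indexed_url("/sheetmusic/some-song", "/landing-page", 301))  # /landing-page
```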
It is still a pretty weird trick, and I still don't know of a good reason to do it. I would like to know if there are any consequences to this weird 'technique'. -
Thanks Wesley, that makes sense... but what's weirdest to me is that Google keeps their pages in the index despite this trick... unless the 302 redirect legitimately allows that (maybe for a limited time)?
-
I don't think 'cloaking' is the right word, since that means hiding content from users that you do want to present to the search engines. It is pretty weird, though. A 302 should be a temporary redirect, and wanting to noindex the page it redirects to could make sense in some way.
If they are planning on changing the website, then they could be temporarily redirecting the URLs to new ones which they don't want indexed yet. When they have made the necessary changes, they will remove the redirect and possibly the noindex directives.
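If that guess is right, their logic might look something like this sketch (the paths and function names are hypothetical, just to make the workflow concrete):

```python
def old_url_response(path: str, migration_done: bool):
    # While the migration is underway, old URLs 302 (temporarily) to the
    # new ones; once it is finished, the redirect is dropped and the old
    # URL serves its final page again.
    if not migration_done:
        return 302, {"Location": "/new" + path}
    return 200, {}

def new_page_robots_meta(migration_done: bool) -> str:
    # The redirect target asks not to be indexed until things settle down.
    return "" if migration_done else '<meta name="robots" content="noindex, nofollow">'

status, headers = old_url_response("/sheetmusic/some-song", migration_done=False)
print(status, headers["Location"])  # 302 /new/sheetmusic/some-song
```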
Seems like a weird workaround, but I've seen people think in weirder ways before.
It's more probable that they suffered from a Panda or Penguin update and that, just like you, they thought they could recover with a noindex (and a redirect?). Pretty weird story; curious to see if anyone else has an explanation for why someone would set their site up like this.