Navigation for search
-
We are getting ready to launch a site that has great navigation for users but not-so-great navigation for search engines. As long as we are ethical about it, does anyone see a downside to detecting a bot user agent and displaying different nav to it? I suppose some could consider it cloaking, but I noticed Amazon uses this strategy and they don't seem to be getting a big penalty, lol. We are not going to do anything shady with it, just offer the bot a different way to access our content. Any thoughts?
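(For context, the kind of bot detection being described usually boils down to a user-agent substring check like the sketch below. This is a hypothetical Python illustration, not anyone's actual implementation; the crawler tokens and template names are assumptions. Note that this is precisely the pattern the replies below warn can be treated as cloaking.)

```python
# Naive user-agent sniffing for serving alternate navigation to crawlers.
# Illustrative only: this is the pattern that risks being seen as cloaking.
KNOWN_BOTS = ("googlebot", "bingbot", "slurp", "duckduckbot")

def is_search_bot(user_agent: str) -> bool:
    """Return True if the user-agent string contains a known crawler token."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOTS)

def choose_navigation(user_agent: str) -> str:
    """Pick which (hypothetical) navigation template to render for a request."""
    return "nav_bot.html" if is_search_bot(user_agent) else "nav_user.html"
```

Real crawlers also verify poorly against user-agent strings alone, since the header is trivially spoofed, which is part of why this approach is fragile as well as risky.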
-
I would never recommend altering your site in any way specifically for a bot. Google has repeatedly stated that this would be seen as cloaking, and you definitely run the risk of a penalty, which can happen at any time. Do you really want that cloud hanging over your head?
Even if Amazon is able to do it, you are not Amazon. Despite any statement to the contrary, a huge site like Amazon can do things that you or I may not be able to do. We can debate the fairness of it, but we are not on a level playing field with them.
You seem to really want to do this, so feel free. But a few months from now you will likely be posting "My rankings dropped and I can't figure out why," and then someone will notice the bot detection and that will be the answer.
-
Could you please share the site in question? We might be able to offer much more insightful feedback.
-
Thanks for your suggestion, Alex!
-
All the content will be accessible through the sitemap, which will be displayed in both the user and bot versions of the site. The only difference is that the navigation will change from one version to the other. It could be perceived as a gray area, even though we really are just making the site more bot-friendly, not trying to be deceptive at all.
-
And that should be okay if they see the content is the same between your normal version and the spider version, with just a difference in presentation. It becomes a problem when you start serving drastically different content.
-
How about adding an indexable version of the navigation and then hiding it with CSS? The crawlers would still see it. I haven't thought through whether that's a good idea, just a quick thought.
-
We usually operate under the principle: would this be okay if it were manually reviewed? Under our plan, I am pretty confident that it would pass. We are planning on making what is displayed to bots very basic text-link navigation. Our XML and HTML sitemaps will correspond closely with the bot navigation (to show Google we really are not trying to hide anything). And yes, we could make the nav more search-friendly, but one of our main goals is to have the site look as clean as possible. As far as I can tell, Amazon uses this method to avoid duplicate-content issues and make their site generally more crawlable, which is why I think it is on the table as an option.
-
Depending on how much you're changing, I think it could be considered cloaking and end up getting you in trouble. You'll invite scrutiny any time you're sniffing for bots and serving up different content. Are there other ways you could make the navigation indexable?