Navigation for Users vs Spiders
-
We're creating a new global site nav that provides a great user experience but may be less than ideal for the search engines. The user selects an item from category A, is then presented with options to choose from in category B, and then chooses a specific product. The user does not encounter any actual "links" until they choose the specific product.
Because of the way the navigation is coded, the search engines won't see this navigation path. They're unable to choose an item from A, so they can't get to B, and therefore cannot get to C, which is the actual product page.
We'd like to create an alternative nav for the crawlers, so that they can reach the category pages for A and B, as well as the specific product pages (C).
This alternative nav would be displayed if the user does not have JavaScript enabled; otherwise, the navigation described above will be shown to the user.
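A minimal sketch of what we have in mind (element IDs, class names, and URLs below are hypothetical):

```html
<!-- JS-driven nav shown to users; it is rendered by script,
     so no crawlable links appear in the markup itself -->
<div id="product-picker" class="js-only"></div>

<!-- Plain-link fallback shown only when JavaScript is disabled -->
<noscript>
  <nav>
    <ul>
      <li><a href="/category-a/">Category A</a>
        <ul>
          <li><a href="/category-a/category-b/">Category B</a></li>
        </ul>
      </li>
    </ul>
  </nav>
</noscript>
```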
Moving forward, the navigation the user sees may differ from what is shown to the search engines, based on user preferences (i.e., they may only see some of the categories in the nav, while the search engines will see links to all category/product pages).
I know that, as a general rule, it's important that the search engines see the same thing that the user sees. Does the strategy outlined above put us at risk for penalties?
-
Here are Google's guidelines for developers on making AJAX code crawlable: https://support.google.com/webmasters/answer/174992?hl=en
My advice: focus on your users' experience. Google's crawlers can crawl AJAX and JavaScript reasonably well these days.
Hope this helps!
-
Same response: AJAX is a JavaScript technique for fetching content from another page, and crawlers have no issue indexing it. Nowadays, most big sites use AJAX, like the ones with infinite scroll.
The way they do it is: they put a real link to the next page in the markup (which users don't see, since the "Next" link is hidden via CSS), so both crawlers and users can navigate the site just fine. In your case, you can put real links into each submenu option too; that way you help both users and crawlers.
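For example, the infinite-scroll pattern looks roughly like this (the selectors and URLs are hypothetical):

```html
<style>
  /* The fallback pagination link is in the markup but hidden from users */
  .pagination-fallback { display: none; }
</style>

<div id="infinite-scroll-container">
  <!-- items loaded here via AJAX as the user scrolls -->
</div>

<!-- Crawlers that don't scroll can still follow this real link -->
<nav class="pagination-fallback">
  <a href="/products?page=2" rel="next">Next</a>
</nav>
```

The same idea applies to your submenus: render each option as a real `<a href>` so the category path exists in the source, then layer the AJAX behavior on top.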
-
Sorry, I should have clarified: the navigation uses AJAX, so the links don't actually appear anywhere in the source. We do have breadcrumbs on the product pages. Thanks!
-
Search engines are already good at executing JavaScript, so they WILL see those links too. I would suggest keeping only the "user" navigation and adding breadcrumbs to each product page (the path the user followed to reach that product), so both crawlers and users can also navigate the site by category.
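Breadcrumbs like that can be marked up as plain links, optionally with BreadcrumbList structured data so the category path is explicit to crawlers (the names and URLs below are hypothetical):

```html
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/category-a/">Category A</a> &gt;
  <a href="/category-a/category-b/">Category B</a> &gt;
  <span>Product Name</span>
</nav>

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Category A",
      "item": "https://www.example.com/category-a/" },
    { "@type": "ListItem", "position": 2, "name": "Category B",
      "item": "https://www.example.com/category-a/category-b/" },
    { "@type": "ListItem", "position": 3, "name": "Product Name" }
  ]
}
</script>
```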