Navigation for Users vs Spiders
-
We're creating a new global site nav that provides a great user experience, but may be less than ideal for the search engines. The user selects an item from category A, is then presented with options to choose from in category B, and then chooses a specific product. The user does not encounter any actual "links" until they choose the specific product.
The search engines won't see this navigation path due to the way that the navigation is coded. They're unable to choose an item from A, so they can't get to B, and therefore cannot get to C, which is the actual product page.
We'd like to create an alternative nav for the crawlers, so that they can crawl the category pages for A and B, as well as the specific product pages (C).
This alternative nav would be displayed if the user does not have JavaScript enabled. Otherwise, the navigation described above will be shown to the user.
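A minimal sketch of that fallback idea, assuming hypothetical category and product data (none of these names come from the actual site): the same catalog data renders as plain, crawlable links, which scripting can later replace with the interactive A → B → C picker.

```javascript
// Hypothetical catalog data: category A -> category B -> products (C).
const catalog = {
  Shoes: {
    Running: ["Trail Runner 2", "Road Racer"],
    Hiking: ["Summit Boot"],
  },
};

// Build a plain-HTML fallback nav: every category and product is a real
// <a href>, so crawlers (and no-JS users) can reach A, B, and C directly.
function buildFallbackNav(catalog) {
  const items = [];
  for (const [catA, subs] of Object.entries(catalog)) {
    items.push(`<a href="/${encodeURIComponent(catA)}">${catA}</a>`);
    for (const [catB, products] of Object.entries(subs)) {
      const basePath = `/${encodeURIComponent(catA)}/${encodeURIComponent(catB)}`;
      items.push(`<a href="${basePath}">${catB}</a>`);
      for (const product of products) {
        items.push(`<a href="${basePath}/${encodeURIComponent(product)}">${product}</a>`);
      }
    }
  }
  return `<nav id="fallback-nav">${items.join("")}</nav>`;
}
```

On page load, a script could hide `#fallback-nav` and mount the interactive version in its place, so users with JavaScript never see the plain list.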
Moving forward, the navigation that the user sees may be different from what is shown to the search engine, based on user preferences (ie they may only see some of the categories in the nav, while the search engines will see links to all category/product pages).
I know that, as a general rule, it's important that the search engines see the same thing that the user sees. Does the strategy outlined above put us at risk for penalties?
-
Here are Google's guidelines for developers on how to make AJAX code crawlable: https://support.google.com/webmasters/answer/174992?hl=en
I'd suggest you focus primarily on your users' experience; I believe Google's crawlers can crawl your AJAX and JS code without much trouble.
Hope this helps!
-
Same response: AJAX is a JavaScript technique for fetching content from another page, and crawlers have no issue indexing that. Nowadays, most big sites use AJAX, like the ones with infinite scroll.
The way they do it: they include a real link to the next page (which users don't see, since the "Next" link is hidden via CSS), so both crawlers and users can navigate the site just fine. In your case, you can put links into each submenu option too; that way you will help both users and crawlers.
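A quick sketch of that hidden "Next" pattern (the URL scheme and CSS class name are assumptions for illustration): the anchor exists in the HTML source, so crawlers can follow the pagination chain even though CSS keeps it out of view and script takes over scrolling for users.

```javascript
// Render a real pagination link for crawlers; a "visually-hidden" CSS class
// (assumed to exist in the stylesheet) keeps it invisible to users, while
// the infinite-scroll script fetches the same URL via AJAX.
function renderNextLink(currentPage) {
  return `<a class="visually-hidden" rel="next" href="/products?page=${currentPage + 1}">Next</a>`;
}
```

The same idea applies to the submenu options: each option can be a real `<a href>` whose click the script intercepts, so the link is in the source for crawlers but behaves like the AJAX picker for users.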
-
Sorry, I should have clarified: the navigation uses AJAX, so the links don't actually appear anywhere in the source. We do have breadcrumbs on the product pages. Thanks!
-
Search engines are already good at executing JavaScript, so they WILL see those links too. I would suggest keeping only the "user" navigation and adding breadcrumbs to each product page (the path the user followed to reach that product) so crawlers and users can also navigate the site by category.
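The breadcrumb suggestion can be sketched like this (URL structure is an assumption): each crumb is a real link one level up the category hierarchy, giving crawlers and users a second route to the A and B category pages.

```javascript
// Build a breadcrumb trail from a category path, e.g.
// ["Shoes", "Running", "Trail Runner 2"]. Each crumb links to the
// cumulative path so far, so every category level gets a crawlable URL.
function buildBreadcrumbs(trail) {
  let href = "";
  const crumbs = trail.map((name) => {
    href += "/" + encodeURIComponent(name);
    return `<a href="${href}">${name}</a>`;
  });
  return `<nav class="breadcrumbs">${crumbs.join(" &gt; ")}</nav>`;
}
```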