Looking for feedback about "look-ahead" navigation
-
Our company has been building websites where the navigation is developed in such a way that the visitor gets a preview of the image and/or content that is on the target page. Here are two websites that use this technique:
http://www.uniquepadprinting.com/
http://www.empathia.com/ (On this site, the previews are only available if you click on "Whole", "Productive" or "Safe" at the top of the page.)
I'm looking for feedback such as:
- What do you call this type of navigation? (We call it "look-ahead", but I can't find much information on that term on the web.)
- Have you experienced any issues with this type of navigation?
- Do you have any recommendations on it?
Some of the things we've seen are:
- It adds the same content to every page of the website
- It creates a lot of internal links
- It can create a lot of code on pages
- It can slow page-load times
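For anyone unfamiliar with the pattern, here is a bare-bones sketch of the behaviour in TypeScript; the markup, class names, and data attributes are made up for illustration and this is not our production code:

```typescript
// Assumed (hypothetical) markup: each nav link carries data attributes that
// describe the page it points to, e.g.
// <a class="nav-link" href="/services"
//    data-preview-img="/img/services.jpg"
//    data-preview-text="Overview of our pad printing services">Services</a>

const previewBox = document.createElement("div");
previewBox.className = "nav-preview";
previewBox.hidden = true;
document.body.appendChild(previewBox);

document.querySelectorAll<HTMLAnchorElement>("a.nav-link").forEach((link) => {
  link.addEventListener("mouseenter", () => {
    const img = link.dataset.previewImg ?? "";
    const text = link.dataset.previewText ?? "";
    // Show a small "look-ahead" panel with the target page's image and blurb.
    previewBox.innerHTML =
      (img ? `<img src="${img}" alt="">` : "") + `<p>${text}</p>`;
    previewBox.hidden = false;
  });
  link.addEventListener("mouseleave", () => {
    previewBox.hidden = true;
  });
});
```

Because the preview image and blurb for every menu item ride along in the markup of every page, the extra markup and internal links repeat site-wide, which is where the concerns above come from.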
-
I think developers worry too much about how their code impacts search results. It's the visible content that is most important to Google. As long as Google can crawl it, you're generally in good shape with the code.
-
Hi Chris,
Thanks for your response. I'm glad you like the look; I agree it can be a cool addition. However, I was surprised that you stated the sites don't add much code. If you take a closer look at the code of any of the interior pages of Unique Pad Printing, for example, you will see that the navigation takes up over 200 lines of code. Original content doesn't appear until you get to line 411 of the code.
Compare this to another site, such as http://www.industrialvacuum.com. The navigation takes only 30 lines of code, and original content appears at line 124.
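To make the size concern concrete, here is a rough sketch of how the same preview panel could be filled on demand instead of being repeated in every page's markup; the /previews endpoint is hypothetical, and the trade-off is that the preview text would no longer sit in the page source for crawlers:

```typescript
// Hypothetical sketch: keep only the links in the HTML and fetch a small
// preview fragment the first time a menu item is hovered.
const previewCache = new Map<string, string>();

async function loadPreview(link: HTMLAnchorElement): Promise<string> {
  const url = link.href;
  const cached = previewCache.get(url);
  if (cached !== undefined) return cached;
  // e.g. GET /previews?page=/services returns a tiny HTML snippet
  const path = new URL(url).pathname;
  const res = await fetch(`/previews?page=${encodeURIComponent(path)}`);
  const html = await res.text();
  previewCache.set(url, html);
  return html;
}

document.querySelectorAll<HTMLAnchorElement>("a.nav-link").forEach((link) => {
  link.addEventListener("mouseenter", async () => {
    const previewBox = document.querySelector<HTMLElement>(".nav-preview");
    if (!previewBox) return;
    previewBox.innerHTML = await loadPreview(link);
    previewBox.hidden = false;
  });
});
```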
-
I'm not sure it does any of those things. It does create additional code but not much. I think it's kind of cool. Would be nice if the display block were clickable, though.
Related Questions
-
Page with "random" content
Hi, I'm creating 300+ pages in the near future, on which the content will basically be as unique as it can be. However, on every refresh, including visits coming from a search engine referrer, I want the actual content, such as a listing of 12 businesses, to be displayed in random order on every hit. So basically we have 300+ nearby pages with unique content, and the overview of those "listings," as I might call them, is displayed randomly. I've built an extensive script and disabled all caching for the PHP files on these specific pages, and it works. But what about Google? The content of the pages will stay as it is; it's more that the listings are shuffled randomly to give every business listing a fair shot at a click. Does anyone have experience with this? I've tried a few things in the past, like a "Last update PHP Month" in the title, which sometimes isn't picked up very well.
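For reference, a minimal sketch of the setup described, translated to TypeScript/Node (the original script is PHP); the listing data, port, and markup are made up:

```typescript
import { createServer } from "node:http";

// Hypothetical data: the business listings shown on one of the ~300 pages.
const listings = ["Business A", "Business B", "Business C", "Business D"];

// Fisher-Yates shuffle so every listing gets an equal shot at each position.
function shuffle<T>(items: T[]): T[] {
  const copy = [...items];
  for (let i = copy.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy;
}

createServer((req, res) => {
  // Ask browsers and proxies not to cache, so every hit re-shuffles.
  res.writeHead(200, {
    "Content-Type": "text/html; charset=utf-8",
    "Cache-Control": "no-store",
  });
  const items = shuffle(listings).map((name) => `<li>${name}</li>`).join("");
  res.end(`<ul>${items}</ul>`);
}).listen(3000);
```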
Technical SEO | Vanderlindemedia
-
SEO Troubleshooting? Not ranking in Top 50 for "easy" keyword
Hi there, First of all, thank you in advance to whoever steps in to help me with this issue! So, I have a new site (launched December 2016) in the investing space and have been able to get page 1 rankings on some of my pages. One of my best ranking pages is for the phrase "what is xiv". The Keyword Explorer has this phrase at a 21 difficulty. My page for this keyword is https://www.projectoption.com/what-is-xiv/. The post reached the first page almost immediately after being published, though I know I shouldn't expect this for other keywords of similar difficulty. Here is my problem: I just wrote a comprehensive guide (8,000+ words) on a different keyword phrase: "vertical spreads." The Keyword Explorer has this phrase at a 25 difficulty. My page for this topic: https://www.projectoption.com/vertical-spreads-explained/. However, the page is nowhere to be found in organic Google rankings (not in top 50), and the page has been live for a few weeks now. I've done my best at optimizing the post, but something leads me to believe there are some issues that are beyond my SEO knowledge. For example, maybe the post is too long, and Google can't figure out what the page is about. Any insights would be greatly appreciated. Thank you in advance for your time! -Chris
Technical SEO | cbutler22293
-
How to explain "No Return Tags" Error from non-existing page?
In the Search Console of our Google Webmaster account we see 3 "no return tags" errors. The attached screenshot shows the detail of one of these errors. I know that annotations must be confirmed from the pages they are pointing to: if page A links to page B, page B must link back to page A, otherwise the annotations may not be interpreted correctly. However, the originating URL (/#!/public/tutorial/website/joomla) doesn't exist anymore. How can these errors still show up?
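For readers unfamiliar with the error: hreflang annotations have to be reciprocal, so every language variant lists all variants (including itself). A small sketch with made-up URLs:

```typescript
// If the en page lists the de page but the de page omits the en entry,
// Search Console reports a "no return tags" error.
const variants = [
  { lang: "en", url: "https://www.example.com/en/tutorial/" },
  { lang: "de", url: "https://www.example.com/de/tutorial/" },
];

// The same block of <link> tags belongs in the <head> of every variant.
function hreflangTags(): string {
  return variants
    .map((v) => `<link rel="alternate" hreflang="${v.lang}" href="${v.url}" />`)
    .join("\n");
}

console.log(hreflangTags());
```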
Technical SEO | Maximuxxx
-
Combining variants of "last modified", cache duration, etc.
Hiya, As you know, you can specify the date of the last change of a document in various places, for example the sitemap, the HTTP headers, and the ETag, and you can also signal an "expected" change, for example a cache duration via header/.htaccess (or even the changefreq in the sitemap). Is it advisable, or rather detrimental, to use multiple variants that essentially tell browsers/search engines the same thing? I.e. should I send a lastmod header AND an ETag AND maybe something else? Should I send a cache duration at all if I send a lastmod? (Assume that I can keep them correct and consistent, as the data for each will come from the very same place.) Also: are there any clear recommendations on which change-indicating method should be used? Thanks for your answers! Nico
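To make the question concrete, here is a sketch (values made up) of one response that sends all three indicators, each derived from the same underlying data:

```typescript
import { createServer } from "node:http";

// One source of truth for "when did this document last change".
const lastModified = new Date("2016-07-01T00:00:00Z");
const etag = `"v-${lastModified.getTime()}"`; // derived from the same value

createServer((req, res) => {
  // Conditional request support: if the client already has this version,
  // answer 304 Not Modified instead of resending the body.
  if (req.headers["if-none-match"] === etag) {
    res.writeHead(304);
    res.end();
    return;
  }
  res.writeHead(200, {
    "Content-Type": "text/html; charset=utf-8",
    "Last-Modified": lastModified.toUTCString(),
    "ETag": etag,
    // The "cache duration": clients may reuse the copy for a day, then revalidate.
    "Cache-Control": "max-age=86400",
  });
  res.end("<html><body>Document body here</body></html>");
}).listen(3000);
```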
Technical SEO | netzkern_AG
-
Google caching the "cookie law message"
Hello! So I've been looking at the cached text-only version of our website. (Google Eyes is a great add-on for this.) One thing I've noticed is that Google caches our EU cookie law message. The message appears at the top of the page and Google is caching it. The message is enclosed within certain tags, but it is still being cached. I'm going to ask the development team to move the message to the bottom of the page and fix its position, but reviewing other websites with cookie messages, Google isn't caching them in their text-only versions. Any tips or advice?
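A sketch of the change described (render the notice last in the markup and fix its position visually); the class name and copy are made up:

```typescript
// Build the cookie notice and append it as the last element of <body>,
// so it appears after the main content rather than at the top.
const notice = document.createElement("div");
notice.className = "cookie-notice";
notice.textContent = "This site uses cookies. By continuing, you accept their use.";
notice.style.position = "fixed"; // keep it visible without moving it up in the markup
notice.style.left = "0";
notice.style.right = "0";
notice.style.bottom = "0";

document.addEventListener("DOMContentLoaded", () => {
  document.body.appendChild(notice);
});
```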
Technical SEO | Bio-RadAbs
-
Mobile WordPress Plugins - keep the look the same
Hello... I'm looking for some advice on WordPress plugins to make my site optimized for mobile visitors. There are tons of plugins to choose from, but I'm looking for something that will keep the look of the site the same, just optimized to perform and display better on mobile devices. Does anyone have any advice on good WordPress plugins that will help accomplish what I'm looking for? Thanks
Technical SEO | Prime85
-
I have a ton of "duplicate content" and "duplicate titles" on my website - any solutions?
Hi, and thanks in advance. I have a JomSocial site with 1,000 users. It is highly customized, and as a result of the customization we did, some of the pages have 5 or more different types of URLs pointing to the same page. Google has indexed 16,000 links already, and the crawling report shows a lot of duplicated content. These links are important for some of the functionality, are dynamically created, and will continue growing. My developers offered to create rules in the robots file so a big part of these links doesn't get indexed, but a Google Webmaster Tools post says the following: "Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools." Here is an example of the links:
http://anxietysocialnet.com/profile/edit-profile/salocharly
http://anxietysocialnet.com/salocharly/profile
http://anxietysocialnet.com/profile/preferences/salocharly
http://anxietysocialnet.com/profile/salocharly
http://anxietysocialnet.com/profile/privacy/salocharly
http://anxietysocialnet.com/profile/edit-details/salocharly
http://anxietysocialnet.com/profile/change-profile-picture/salocharly
So the question is: is this really that bad? What are my options? Is it really a good solution to set rules in robots so big chunks of the site don't get indexed? Is there any other way I can resolve this? Thanks again! Salo
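For illustration, a sketch of the rel="canonical" approach Google recommends in the quoted passage: each URL variant of a profile declares one preferred URL (here assumed to be /username/profile); the helper function is hypothetical:

```typescript
// Emit the same canonical tag in the <head> of every variant, e.g.
// /profile/edit-profile/salocharly, /profile/privacy/salocharly, ...
function canonicalForProfile(username: string): string {
  return `<link rel="canonical" href="http://anxietysocialnet.com/${username}/profile" />`;
}

console.log(canonicalForProfile("salocharly"));
// -> <link rel="canonical" href="http://anxietysocialnet.com/salocharly/profile" />
```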
Technical SEO | Salocharly
-
If you only want your home page to rank, can you use rel="canonical" on all your other pages?
If you have a lot of pages with 1 or 2 inbound links, what would be the effect of using rel="canonical" to point all those pages to the home page? Would it boost the rankings of the home page? As I understand it, your long-tail keyword traffic would start landing on the home page instead of finding what it was looking for. That would be bad, but it might be worth it.
Technical SEO | watchcases