Main menu duplication
-
I am working on a site that went through a migration to Shopify at the same time as Google rolled out an update in October, so there have been problems from day one.
All main menu categories have subsequently, over the past six weeks, fallen off a cliff. All aspects of the site have been reviewed (technical, link profile and on-page), and the site is in better shape than several ranking competitors.
One issue that I'd like some feedback on is the main menu, which has four iterations in the source:
- desktop
- desktop (sticky)
- mobile
- mobile (sticky - appears in the source as a second desktop sticky, but I assume it is intended for mobile)
These "duplicated" menus contain the top-level menu items only; the rest of the nested menu items are included only within the last (mobile) menu.
So:
- the desktop menu in the source doesn't include any of the sub-menu items; the mobile version carries all of these
- there are four versions of the top-level main menu items in the source (a quick way to verify this is sketched below)
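For anyone who wants to check this on their own store, here is a minimal sketch in Python (using requests and BeautifulSoup) that counts how many times each menu link appears in the raw, non-rendered source. The URL is a placeholder and the assumption that the menus live inside nav elements is hypothetical - adjust the selector for your own theme.

```python
# Minimal sketch: count how often each menu link appears in the raw
# (non-rendered) HTML source. URL and <nav> selector are assumptions.
from collections import Counter

import requests
from bs4 import BeautifulSoup

html = requests.get("https://www.example-store.com/", timeout=30).text
soup = BeautifulSoup(html, "html.parser")

# Count every anchor href found inside any <nav> element in the base source.
counts = Counter(
    a["href"]
    for nav in soup.find_all("nav")
    for a in nav.find_all("a", href=True)
)

for href, n in counts.most_common():
    if n > 1:
        print(f"{n}x  {href}")  # links repeated across the menu iterations
```

A top-level link printing with a count of four would match the four menu iterations described above.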
Should I be concerned? Considering we have significant issues should this be cleaned up?
-
A couple of other issues were also uncovered with the browser rendering of certain collections. We have cleaned up the menu duplication and those issues, and we are now monitoring.
-
You are right to be concerned, and many in the SEO community don't feel that Shopify has 'nailed' SEO yet. It started as a slightly nicer version of Wix where you could build your own site pretty easily, but it also handles a lot of the eCommerce aspects (which makes it very attractive to business owners; sadly, it's not great for SEO).
The community is expanding and the number of plugins and add-ons for Shopify is broadening. The problem is that many developers working on the Shopify platform don't have much SEO experience (at least, that has been my experience of the Shopify community).
If you are finding that certain items are missing from the 'base' (non-modified) source code, that is a concern. Google can technically crawl generated content and links (those rendered client-side), but that requires headless browsers and client-side rendering, which on average takes around 10x longer than basic source-scraping. Google's mission is to 'index the web', so although they have this newer technology and functionality, they wouldn't arbitrarily take a 10x efficiency hit across all indexation (that would be nutty and would go against their prime directive).
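To make the 'missing from base source' check concrete, here is a minimal sketch, assuming Python with requests, BeautifulSoup and Playwright installed (pip install playwright, then playwright install chromium; the URL is a placeholder). It diffs the links visible in the raw HTML against the links that only exist after JavaScript runs - anything in the diff can only be discovered by the slower rendered crawl.

```python
# Minimal sketch: compare links in the raw HTML source against links in the
# fully rendered DOM. URL is a placeholder; swap in your own page.
import requests
from bs4 import BeautifulSoup
from playwright.sync_api import sync_playwright

URL = "https://www.example-store.com/"

def links_from_html(html: str) -> set[str]:
    soup = BeautifulSoup(html, "html.parser")
    return {a["href"] for a in soup.find_all("a", href=True)}

# Links visible to basic source-scraping.
raw_links = links_from_html(requests.get(URL, timeout=30).text)

# Links visible only after JavaScript has run in a headless browser.
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_links = links_from_html(page.content())
    browser.close()

print("Links that require rendering to discover:")
for href in sorted(rendered_links - raw_links):
    print(" ", href)
```

If critical navigation shows up in that output, you have exactly the problem described above: content that only a rendered crawl can see.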
Rendered crawling is deployed by Google for popular web pages. When it is used, it is not used with the same frequency as basic crawling - and not everyone gets that special treatment!
If you're not Santander or Coca Cola, you should be thinking about how you can help Google rather than how Google will "certainly use their latest technologies to help me, a small to medium business owner - at any expense!" - it just won't happen (sorry!)
The Shopify community is commerce- and design-led. One thing they are really bad at is latching onto one-off, isolated comments from Google (such as "we can crawl JavaScript now!") and then applying them to everything without testing first in iterations. The fact is, sites that do more server-side rendering still perform better than sites which rely too heavily on client-side rendering (especially as the latter drastically impacts page-loading speed and burdens the end user).
If I were finding lots of critical stuff that didn't appear in the base (non-modified) source code and my site wasn't a household name, I'd be really, really concerned!
I am sure that the right Shopify designers and developers could sort it out for you, but it may be costly - especially as devs in that community won't believe you that it's necessary, and will fire loads of posts from Google at you stating that what they have already done is fine. Comments from the horse's mouth are useful, but not without greater context.