Too many links on homepage - what to do with an overloaded main navigation
-
Hi there,
on my website (http://dealcity.de) I aggregate group-shopping deals from across Germany.
To provide a good user experience, there is a large main navigation in the header where users can choose the city they are interested in.
The problem is that this main navigation contains 220 links, one for every city. That is far too many for a fairly new site like this if we want link juice to flow to the pages that matter to us.
What should I do with the main navigation? Is there any way to remove these links from the link graph while keeping the current user experience?
Best Regards
Markus -
If it's purely a user-experience feature and you don't need Google to crawl the links, I have two suggestions:
1 - Build the navigation in JavaScript or some other technique Google can't crawl, so there is no way Google can see or use the links
2 - Alternatively, you could simply stick a nofollow on all of the navigation links
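The nofollow option can be as simple as adding `rel="nofollow"` to each anchor. A minimal sketch below - the `/city/` URL pattern and the helper function are made up for illustration, not taken from the actual site:

```javascript
// Hypothetical helper: render one city navigation link with rel="nofollow"
// so it is discounted from the link graph. The URL pattern is an assumption.
function renderCityLink(name, slug) {
  return `<a href="/city/${slug}" rel="nofollow">${name}</a>`;
}

console.log(renderCityLink("Berlin", "berlin"));
// → <a href="/city/berlin" rel="nofollow">Berlin</a>
```

You'd loop that over all 220 cities when rendering the header, so users see the same navigation while the links carry nofollow.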
Not sure if either of the above helps; to be honest, I'm a bit unclear on your needs. If you truly build the navigation in the best interest of your users, the number of links shouldn't be an issue. If they're all city links, why not try grouping them by country first, then by city?
Craig
-
I worry about the AJAX option here. Are there any benefits beyond hiding the links from search engines? In my opinion, if the AJAX errors and the navigation doesn't load, that is very bad for the person using the site.
I seriously doubt that Google can't distinguish between a set of necessary primary-navigation links and spammy links.
-
Another solution would be geo-targeting users and pre-setting their location, with an option to adjust it. Use dropdowns or AJAX to select cities. I find the top nav a bit strange - I'm not sure I would use it. Another option would be a prominent search field giving users quick access to their city's deals.
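A rough sketch of the dropdown idea - the URL pattern and element ID below are hypothetical, but the principle is that search engines treat `<select>` options as form input rather than links, so the 220 cities stay out of the crawlable link graph:

```javascript
// Hypothetical: build the deal-page URL for a chosen city slug.
// The dealcity.de URL structure here is assumed, not confirmed.
function cityDealsUrl(citySlug) {
  return `https://dealcity.de/city/${encodeURIComponent(citySlug)}`;
}

// Browser wire-up (sketch): navigate when the user picks a city.
// document.getElementById("city-select").addEventListener("change", (e) => {
//   window.location.assign(cityDealsUrl(e.target.value));
// });

console.log(cityDealsUrl("berlin"));
// → https://dealcity.de/city/berlin
```

You could then keep a handful of plain HTML links to your most important cities for crawlability, and serve the long tail through the dropdown.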