Should I worry about limiting link count on product listing/category pages?
-
I've noticed that my link count is high (around 165 on some pages) on my category listing pages. I've been scouring my pages to see whether there's any way to reduce the link count without restricting functionality for the end user.
Each product listing on the category page has 5 links currently:
- A link to the product in the title
- A link to the product from the image
- An 'add to compare' link
- An 'add to cart' link
- An 'add to wishlist' link
When the customer chooses to show 30 products per page, the link tally goes off the scale. So I have two questions:
Firstly, is it appropriate to keep the link count down in this scenario? To elaborate: is it just inevitable that product listing pages will have lots of links, and should I just assume that Google knows this and forget about these warnings?
Secondly, there are two links to the same page (the title and the image both link to the product page). Does SEOmoz include these in the link count, and more importantly, will Google take heed of them when deciding whether the page is too link-heavy?
-
I'm sorry, Simon, but you've reached the extent of my knowledge on the topic. Many of these things can only be answered by a Google employee, who can't answer because of a non-disclosure agreement.
Our sources of Google information are primarily Matt Cutts, official Google announcements, and the occasional response shared by a Google employee in their forums.
I would agree with you that Google is quite capable of reading and executing JavaScript if it wants to. I plan to run a lot of tests in the future, and this sounds like a good candidate. In the meantime, I'd welcome any additional knowledge or experience others can share on this topic.
-
Thanks for the thorough response, Ryan.
I've changed the add-to links to onclick=setLocation(). It seems logical to separate the 'functional' elements of a website from the content elements.
I've read some differing opinions about the use of JavaScript instead of HTML, with some people suggesting that Google is just as capable of reading JavaScript syntax. Whilst I'm sure that's true, it seems like a rational way of distinguishing between content and function. I'm curious to know whether Google actually makes that distinction, and whether it's considered standard best practice to separate your linking methods in this way?
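For anyone following along, here is a minimal sketch of that content/function separation. The URLs, class names, and the setLocation helper are placeholders for illustration, not any particular platform's exact implementation:

```javascript
// Stand-in for the store's navigation helper; in a real page this
// would set window.location.href. It returns the URL here so the
// sketch can be exercised outside a browser.
function setLocation(url) {
  return url;
}

// A content link the crawler should see and follow: a real anchor.
const productLink = '<a href="/blue-widget.html">Blue Widget</a>';

// A functional "link": no href for a crawler to count or follow,
// just a script-driven action on the same page.
const addToCartBtn =
  '<span class="btn-cart" onclick="setLocation(\'/checkout/cart/add/id/123\')">Add to cart</span>';

console.log(productLink.includes('href='));   // true  - crawlable content link
console.log(addToCartBtn.includes('href='));  // false - functional element only
```

The idea is simply that pages the user can navigate to stay as plain anchors, while same-page actions carry no href at all.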
-
Thanks Stephen, I've seen some of the debate, but I'm more curious what sort of strategy is best for ecommerce sites specifically, where many of the links serve as functions rather than content links (as Ryan mentions below). Any pointers?
-
Is it just inevitable that product listing pages will have lots of links, and should I just assume that Google knows this and forget about these warnings?
When you offer 165 links on a page, they all receive a share of the same link juice, adjusted for where they appear on the page (i.e. header, footer, navigation), with the idea that links at the top of the page are probably given more value than links lower on the page. To that end, it has less to do with what Google knows and more to do with how you view the importance of your pages.
One possible idea: can the "add to" links be presented in another format so that they are not counted as links by Google? Add to cart, add to wishlist, and add to compare don't seem to offer any value to search engines. Perhaps they can be presented in a block together and not marked up as links. As for the specific method, you could use encoded JavaScript or other options; I suggest speaking with a programmer on this topic.
Normally I don't endorse methods to hide links, but these aren't links in the traditional sense. The user is not going anywhere; instead they are triggering an action. When a user clicks on a product link, that is a link in the traditional sense: they are taken to a new page on your site. When a user clicks one of the "add to" buttons, the user remains on the same page and an action is performed. If you do go with the JavaScript method, keep in mind any adjustments necessary for analytics tracking of those actions.
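To illustrate that last point about analytics: once an "add to" link becomes a script action, the click should still be recorded explicitly. A rough sketch using classic Google Analytics' async queue (`_gaq`); the category, action, and URL pattern here are invented for the example:

```javascript
// The GA async snippet normally defines this queue on the page.
var _gaq = _gaq || [];

function addToCart(productId) {
  // Record the action so it is not lost when the old href disappears.
  // _trackEvent takes (category, action, label).
  _gaq.push(['_trackEvent', 'Catalog', 'AddToCart', String(productId)]);
  // Then perform the navigation the old link used to do:
  // window.location.href = '/cart/add/' + productId;  (browser only)
  return '/cart/add/' + productId;
}

console.log(addToCart(123)); // "/cart/add/123"
console.log(_gaq.length);    // 1 - one event queued
```

The same pattern applies to whichever analytics package you use: fire the event inside the click handler, then trigger the action.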
There are two links to the same page (the title and image links to the product page). Does SEOmoz include this in the link count, and more importantly, will Google take heed of these when deciding whether the page is too link-heavy?
SEOmoz will count every link, regardless of whether it points to the same target or not. No one knows for sure how Google handles this situation; opinions vary, and over time I have seen theories going each way. We do know that Google only associates anchor text with the first link to a target page. The weighting factors involved with multiple links to the same target are open to speculation.