How many internal links on one page?
-
I have seen Matt Cutts' video about links per page and know that too many links "may" harm the flow of link juice. But what should e-commerce sites do?
We have category pages with a few thousand products in each of them, so doesn't linking to every one of them dilute the PageRank flow?
We could use pagination, but doesn't that hurt the user experience when a visitor needs to go 10 links deep to reach a product? And won't Google's robots update the information less frequently because it sits at the lowest level of our site?
Our goal now is to make all our products load the way Facebook's infinite scroll does. We know that Google doesn't execute Ajax to discover more links, so robots and users without JavaScript would see the paginated results instead. Is this a good way to expose all the products and links?
-
But rel=canonical is offered only to tell search engines whether a page is a duplicate or the original.
Maybe someone else could share their experience with this topic?
-
There is always a chance it could be seen as cloaking, but I think this instance is a legitimate reason to do it. You aren't blatantly trying to manipulate the search engines for your gain. You are trying to help the users first and please the search engines second, so I think it should work fine.
-
We thought about using rel=next/prev and rel=canonical. But wouldn't paginated HTML pages claiming that all the same products are on the category's main page (where Googlebot can't expand the list to see them) be lying to Google?
It would amount to saying that all my products are on a page where bots can't see them. Isn't that called cloaking?
-
I would suggest using pagination. If you allow visitors to filter by best selling, most reviewed, newest, lowest price, etc., that might offset the drawback of spreading products across multiple pages.
If you plan to use Ajax to show the paginated results, that could be a good solution, but the pagination navigation would still have to exist in regular HTML so the search engines can spider it.
I'd also recommend using rel=canonical and rel=prev/next if you do use pagination.
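A minimal sketch of what that could look like on page 2 of a paginated category (the URLs are hypothetical, and this is one common pattern rather than the only correct one): rel=prev/next and the canonical go in the head, while the pagination links are plain crawlable HTML that a script may later enhance with Ajax.

```html
<!-- In the <head> of /category/widgets?page=2 -->
<link rel="canonical" href="https://www.example.com/category/widgets?page=2">
<link rel="prev" href="https://www.example.com/category/widgets?page=1">
<link rel="next" href="https://www.example.com/category/widgets?page=3">

<!-- In the <body>: plain links that crawlers and no-JS users can follow.
     JavaScript may hide this block and load further pages via Ajax instead. -->
<nav class="pagination">
  <a href="/category/widgets?page=1">1</a>
  <a href="/category/widgets?page=2">2</a>
  <a href="/category/widgets?page=3">3</a>
  <a href="/category/widgets?page=3" rel="next">Next</a>
</nav>
```

Because the links exist in the raw HTML source, the progressive enhancement never hides products from robots, which addresses the cloaking worry raised earlier in the thread.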
Scott O.
P.S. If you feel I have been helpful, please consider marking this as "Good Answer".
Related Questions
-
Multiple pages optimised for the same keywords, but the pages are functionally and visually different
Hi MOZ community! We're wondering what the implications for organic ranking would be of having 2 pages with quite different functionality optimised for the same keywords. For example, one of the pages in question is https://www.whichledlight.com/categories/led-spotlights and the other is https://www.whichledlight.com/t/led-spotlights. Both of these pages are basically geared towards the keyword "led spotlights": the first link essentially shows the options for LED spotlights and the different kinds of fittings available, while the second is a product search/results page for all products that are spotlights. We're wondering what the implications of this could be, as we are currently looking to improve the site's ranking for this keyword. Is this even safe to do? Especially since we're at the bottom of the hill climbing the ranking ladder for this keyword. Give us a shout if you want any more detail 🙂
Intermediate & Advanced SEO | TrueluxGroup
-
How to 301 redirect /page.php to /page after a RewriteRule has already made /page.php accessible as /page (getting errors)
A site has URLs with .php extensions, like this: example.com/page.php. I used the following rewrite to remove the extension so that the page can now be accessed from example.com/page:
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L]
It works great. I can access the page via the example.com/page URL. However, it can still be accessed from example.com/page.php. Because I have external links going to the page, I want to 301 redirect example.com/page.php to example.com/page. I've tried this a couple of ways, but I get redirect loops or 500 internal server errors. Is there a way to have both: remove the extension and 301 the .php version to the extensionless one? By the way, if it matters, page.php is an actual file in the root directory (not created through another rewrite or URI routing). I'm hoping I can do this and not just throw a canonical tag for example.com/page on the page. Thanks!
Intermediate & Advanced SEO | rcseo
-
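A common way out of that loop (an untested sketch, not a verified config for this site): make the external 301 match against %{THE_REQUEST}, the raw request line sent by the client, so it only fires when the visitor literally asked for the .php URL and never on the internally rewritten one.

```apache
RewriteEngine On

# Externally 301 direct requests for /page.php to /page.
# %{THE_REQUEST} is the original client request line ("GET /page.php HTTP/1.1"),
# which internal rewrites do not change, so the two rules cannot chase each other.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^\ ?]+)\.php[\ ?]
RewriteRule ^ /%1 [R=301,L]

# Internally map the extensionless URL back onto the real .php file.
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L]
```

The original rule pair loops because REQUEST_URI reflects the rewritten URL: the internal rewrite adds .php, a naive .php-stripping redirect sees it again, and so on. Keying the redirect off the untouched request line breaks that cycle.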
Dynamic pages - ecommerce product pages
Hi guys, before I dive into my question, let me give you some background. I manage an ecommerce site and we've got thousands of product pages. The pages contain dynamic blocks, and the information in those blocks is fed by another system: our product team enters the data in a piece of software and, boom, the information is generated in the page blocks. But that's not all. These pages then redirect to a duplicate version with a custom URL. That version is cached, and it's what the end user sees. This was done to speed up loading: rather than the system generating a dynamic page on the fly, the cached page is loaded and the user sees it super fast. Another benefit appeared as well: after going live with the cached pages, they started getting indexed and ranking in Google. The problem is that the redirect to the duplicate cached page isn't a permanent one; it's a meta refresh, effectively a 302 that happens in a second. So yeah, I've got 302s kicking about. The development team can set up a 301, but then there won't be any caching; pages will just load dynamically. Google records pages that are cached, but does it cache a dynamic page? Without a cached page, I'm wondering if I would drop in traffic. The view source might just show a list of dynamic blocks, no content! How would you tackle this? I've already set up canonical tags on the cached pages, but removing the cache.. Thanks
Intermediate & Advanced SEO | Bio-RadAbs
-
Can pages compete with each other? Inbound links & domain authority - how to determine problem areas?
Hey, I'm having some pretty big SEO issues. 😞 We have had some drops in our rankings: we're on the 5th page or worse, depending on location, for a few of the keywords we used to rank well for, and all sorts of random, non-relevant sites outrank us for the terms "stickley" and "stickley furniture". One thing I noticed is that we rank with a different page for each keyphrase: our home page ranks for "stickley" and our Stickley page ranks for "stickley furniture". Is this normal? I guess Google is just picking what it sees as more relevant. Is it possible that these two pages are "competing"? Do similar phrases linking to different pages cause pages to "fight" or unevenly disperse link juice? I'm having trouble knowing which page I should send inbound links to, since Google seems to be sending similar keywords to different pages. How much should I stress about which pages receive links? Is it true that any inbound link to a site will help increase its overall domain authority and overall SEO? What should I be focusing on? I've added 301 redirects for non-www as well as tried to make the pages well optimized for SEO. Should I just add more related content to the pages? I know backlinks are important, but I'm having a really hard time figuring out how to get links that aren't just spammy forum-post footers or junk directory submissions. The thing that bothers me is that we were ranking well and then suddenly dropped way back. We have never done any black-hat SEO of any sort. I feel a bit stuck and confused at the moment 😞 Thanks in advance for any help!
Intermediate & Advanced SEO | SheffieldMarketing
-Amy
-
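For reference, the non-www 301 consolidation mentioned in this question usually boils down to one rule pair in .htaccess. This is a hedged sketch with a placeholder domain, assuming Apache with mod_rewrite; adapt the host and scheme to the actual site.

```apache
RewriteEngine On
# Send every request for the bare domain to the www host with a single 301,
# preserving the requested path and query string.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Consolidating the two hostnames this way means links pointing at either variant count toward one page instead of splitting between duplicates.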
Does Google treat all updated pages with new links as paid links?
Example: a PR 4 page is updated a year later with new links. Does Google discredit these links as being fishy?
Intermediate & Advanced SEO | imageworks-261290
-
Too many on-page links - product pages
Some of the pages on my client's website have too many on-page links because they contain lists of all their products. Is there anything I should or could do about this?
Intermediate & Advanced SEO | AlightAnalytics
-
Deep page is ranking for main keyword, but I want the home page to rank
A deep page is ranking for a competitive and essential keyword, and I'd like the home page to rank instead. The main reasons are probably that this specific page is optimized for just that keyword and contains the keyword in its URL. I've optimized the home page for this keyword as much as possible without sacrificing the integrity of the home page and the other keywords I need to maintain. My main question is: if I use a 301 redirect from this deep page to the home page, am I risking my current ranking, or will my home page replace it in the SERPs? Thanks so much in advance!
Intermediate & Advanced SEO | ClarityVentures
-
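If you did go the 301 route, the rule itself is trivial (a sketch with a hypothetical path, Apache syntax); the open question is whether the engines carry the deep page's rankings over to the home page, which is not guaranteed, not the mechanics of the redirect.

```apache
# Permanently redirect the deep page to the home page
Redirect 301 /deep-keyword-page/ https://www.example.com/
```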
Link architecture - Xenu Link Sleuth vs manual observation confusion
Hi, I have been asked to complete some SEO contracting work for an e-commerce store. The navigation looked a bit unclean, so I decided to investigate it first.
a) Manual observation: within the catalogue view, I loaded the page source, hit Ctrl-F, and searched for "href". It turns out there are 750-odd links on this page, and most of the other sub-catalogue and product pages also have about 750 links. Ouch! My SEO knowledge tells me this is non-optimal.
b) Link Sleuth: I crawled the site with Xenu Link Sleuth and found 10,000+ pages. I exported into Open Calc and ran a pivot table to count the number of pages per site level. The results: level 0 has 1 page, level 1 has 42, level 2 has 860, and level 3 has 3268.
Now this looks more like a pyramid. I think this is because Link Sleuth can only read one 'layer' of the nav bar at a time; it doesn't 'hover' and read the rest of the nav bar (unlike what can be found by searching for "href" in the page source). Question: how are search spiders going to read the site - as in (a) or as in (b)? Thank you!
Intermediate & Advanced SEO | DigitalLeaf
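Both views can be right at once: a crawler assigns each page the depth of its shortest click path, counting only links it can actually parse, so links hidden behind hover or Ajax menus don't flatten the hierarchy even if "href" counts per page are huge. A toy sketch of that depth computation (hypothetical site graph, Python):

```python
from collections import Counter, deque

def pages_per_level(links, root):
    """Breadth-first search from the root, recording each page's click depth.

    `links` maps a URL to the list of URLs it links to (a toy site graph);
    the result mirrors what a crawler like Xenu reports as pages per 'site level'.
    """
    depth = {root: 0}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # first visit = shortest click path
                depth[target] = depth[page] + 1
                queue.append(target)
    return Counter(depth.values())

# Toy example: home page links to two categories, each category to two products.
site = {
    "/": ["/cat-a", "/cat-b"],
    "/cat-a": ["/p1", "/p2"],
    "/cat-b": ["/p3", "/p4"],
}
print(pages_per_level(site, "/"))  # Counter({2: 4, 1: 2, 0: 1})
```

Feed it only the links visible in crawlable HTML and you reproduce the pyramid from (b); add the hover-menu links as edges from every page and far more pages collapse to depth 1, which is the discrepancy the question describes.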