100+ links on a scrolling page
-
Can you add more than 100 links to your webpage if the page loads more content from a database as a visitor scrolls down?
If you look at the page source, the 100+ links do not show up, only the first 20. As you scroll down, more content and links are added to the bottom of the page, so it's a continuous, flowing page if you keep scrolling.
I just wanted to know how the 100-link maximum fits into this scenario.
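(For context, a typical infinite-scroll setup looks something like the sketch below; the endpoint, page size, and container ID are hypothetical, not taken from the question.)

    // Minimal infinite-scroll sketch (hypothetical endpoint and markup).
    // Loads the next batch of products when the visitor nears the bottom.
    let page = 1;
    let loading = false;

    window.addEventListener('scroll', async () => {
      const nearBottom =
        window.innerHeight + window.scrollY >= document.body.offsetHeight - 200;
      if (!nearBottom || loading) return;

      loading = true;
      page += 1;
      const response = await fetch(`/products?page=${page}`); // hypothetical API
      const html = await response.text();
      document.getElementById('product-list').insertAdjacentHTML('beforeend', html);
      loading = false;
    });

Because the extra links only exist after the script runs, they are invisible to anyone reading the raw page source, which is exactly the situation described above.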
-
Everyone says don't dilute the PageRank, but what is the approximate number?
As of now we have an e-commerce site with many products. We want the page to scroll continuously (like the Facebook feed), which would put 5,000+ links on one page. Would this dilute PageRank?
-
Have a look at this article from Matt Cutts:
http://www.mattcutts.com/blog/how-many-links-per-page/
The original reason for the 100-link recommendation was a technical limitation, but that is no longer the case; 100 is still a general rule of thumb, though. There are certainly times when you need to have more than 100 links; look at Google Plus, for instance, where they do what you describe above and load further content as the user scrolls down.
Keep in mind, though, that the amount of value each link can pass on depends on the total number of links on your page (under the classic PageRank model, each of N links receives roughly 1/N of the page's passable value), so you still wouldn't want to go crazy and link to every page on your website from each page. You mention that not all links will be included in the source code, though, so you shouldn't have the problem of over-diluting the link value. Just make sure that all the links that appear when scrolling are included in a sitemap so they can be crawled, as in the sketch below.
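For illustration, a minimal XML sitemap listing the product pages that only appear on scrolling might look like this (the domain and URLs are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Product pages that are only linked after the visitor scrolls -->
      <url>
        <loc>https://www.example.com/products/item-21</loc>
      </url>
      <url>
        <loc>https://www.example.com/products/item-22</loc>
      </url>
      <!-- ...one <url> entry per product page... -->
    </urlset>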
-
Google (Matt Cutts) has talked about this, saying that there is no set number. He does say that you want to be careful not to make it spammy. A similar issue hit me with Panda, and I have since fixed it. I believe that link placement is important, and how you place links on your page affects this as well.
For instance:
Link1
Link2
Link3
Link4
Link5
is better than
link1,link2,link3,link4,link5
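In HTML terms, the difference is roughly the one sketched below: a structured list of links versus a run of links crammed into one line (the URLs are placeholders, not from the original post):

    <!-- Structured: each link in its own list item -->
    <ul>
      <li><a href="/page-1">Link 1</a></li>
      <li><a href="/page-2">Link 2</a></li>
      <li><a href="/page-3">Link 3</a></li>
    </ul>

    <!-- Crammed: the same links run together in a single line of text -->
    <p><a href="/page-1">link1</a>, <a href="/page-2">link2</a>, <a href="/page-3">link3</a></p>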
-
Related Questions
-
Why did rankings drop from the 2nd page to the 8th page with no penalization?
Dear Sirs, a client of mine for more than 7 years used to have his home page (www.egrecia.es) between the 1st and 2nd page of the Google SERPs, and it suddenly dropped to the 8th page. The keyword in question is "Viajes a Grecia". The site has a good link profile, as we have built links in good newspapers from Spain, and according to Moz it has 99% on-page optimization for that keyword. Why? What could I do to solve this? PS: It has more than 20 other keywords in 1st position, so why did this one drop so far? Thank you in advance!
Intermediate & Advanced SEO | Tintanus
-
How would you link build to this page?
Hi guys, I'm looking to build links to a commercial page similar to this: https://apolloblinds.com.au/venetian-blinds/ How would you even create quality links (without going against Google's TOS) to a commercial page like that? Any ideas would be very much appreciated. Cheers.
Intermediate & Advanced SEO | spyaccounts14
-
Show parts of page A on pages B & C?
Good afternoon,
A quick question. I am working on a website which has a large page with different sections. Let's say:
Page 1
SECTION A
SECTION B
SECTION C
Now they are adding a new area where they want to show only certain sections, so it would look like this:
Page 2
SECTION A
Page 3
SECTION C
Page 4
SECTION D
So my question is: would a rel='canonical' tag pointing back to Page 1 be the correct way of pre-empting any duplicate content issues? I do not need Pages 2-4 to even be indexed; it is just a matter of usability and giving users what they are looking for without all the rest of the extra stuff. Gracias. Teşekkürler. Salamat ko. Thanks. (Bonus thumbs up for anybody who knows which languages those are.) 🙂
Intermediate & Advanced SEO | rayvensoft
First Link on Page Still Only Link on Page?
Bruce Clay and others did some research and found that the first link on the page is the most important and is the one credited as the link; any other links on the page to the same URL count for nothing. Is this still true? And in that case, on an e-commerce site with category links in the top navigation (which sits high in the code), is it not useful to link to categories in the content of the page, because the category is already linked to on that page? Thank you, Tyler
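(To illustrate the situation described, with hypothetical URLs: the navigation link appears first in the source, so under the "first link counts" theory the in-content link's anchor text would be ignored.)

    <!-- Top navigation: appears first in the source -->
    <nav>
      <a href="/category/widgets">Widgets</a>
    </nav>

    <!-- Later, in the page content: a second link to the same URL -->
    <p>Browse our full range of <a href="/category/widgets">discount widgets</a>.</p>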
Intermediate & Advanced SEO | tylerfraser
-
Transferring link juice from a canonical URL to an SEO landing page.
I have URLs that I use for SEM ads in Google. The content on those pages is duplicate (affiliate) content. Those pages also have dynamic parameters, which caused lots of duplicate pages to be indexed. I have put a canonical tag on the parameter pages to consolidate everything to the canonical URL. Both the canonical URL and the parameter URLs have links pointing to them. So as it stands now, my canonical URL is still indexed, but the parameter URLs are not. The canonical page is still made up of affiliate (duplicate) content, though. I want to create an equivalent SEO landing page with unique content, and I'd like to do two things: 1) remove the canonical URL from the index, due to the duplicate affiliate content, and 2) transfer the link juice from the canonical URL over to the SEO URL. I'm thinking of adding a meta noindex, follow tag to the canonical URL and internally linking to the new SEO landing page. Does this strategy work? I don't want to lose the link juice on the canonical URL by adding a meta noindex tag to it. Thanks in advance for your advice. Rob
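(A minimal sketch of the tag combination described, with hypothetical URLs: the old affiliate page is de-indexed while its links, including one to the new landing page, remain followable.)

    <!-- In the <head> of the old affiliate page (hypothetical URL) -->
    <meta name="robots" content="noindex, follow">

    <!-- In its body: an internal link pointing to the new SEO landing page -->
    <a href="https://www.example.com/seo-landing-page">See the full guide</a>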
Intermediate & Advanced SEO | partnerf
-
Do 404 pages pass link juice? And best practices...
Last year Google said bad links to 404 pages wouldn't hurt your site. Could that still be the case in light of recent Google updates meant to combat spammy links and negative SEO? Can links to 404 pages benefit a website and pass link juice? I'd assume at the very least that any link juice will pass through links FROM the 404 page. Many websites have great 404 pages that get linked to: http://www.opensiteexplorer.org/links?site=http%3A%2F%2Fretardzone.com%2F404 - that was the first of four I checked from the "60 Really Cool... 404 Pages" list that actually returned the 404 HTTP status! So apologies if you find the word 'retard' offensive. According to Open Site Explorer it has a decent Page Authority and number of backlinks, but it doesn't show in Google's SERPs. I'd never do it, but if you have a particularly well-linked-to 404 page, is there an argument for giving it a 200 OK status? Finally, what are the best practices regarding 404s and address-bar links? For example, if www.examplesite.com/3rwdfs returns a 404 error, should I make it redirect to www.examplesite.com/404 or leave it as is? Redirecting to www.examplesite.com/404 might not be user-friendly, as people won't be able to correct the URL in the address bar. But if I have a great 404 page that people link to, I don't want links going to loads of random pages, do I? Is either way considered best practice? If I did a 301 redirect I guess it would send the wrong signal to the crawlers? Should I use a 302 redirect, or even a 304 Not Modified?
Intermediate & Advanced SEO | Alex-Harford
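(On the question of a friendly 404 page that still returns the correct status code, one common Apache-style setup is sketched below; the page path is hypothetical.)

    # .htaccess (Apache): serve a custom 404 page with a 404 status.
    # A local path keeps the 404 status code; a full URL here would
    # make Apache issue a 302 redirect instead.
    ErrorDocument 404 /errors/not-found.html

-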
My site's rank is not consistent. One week it is on the first page, then the next it is not found in the top 100 positions. Two or three weeks later it ranks again without any work. Why is this happening?
Here's what is in place on my site: a robots.txt file is included, a sitemap is available, and natural link building is ongoing. We create 100 links in total per week: 30 social bookmarks, 30 directory submissions, 20 blog comments, and 20 forum links. All the blog and forum links are from relevant sources. Please help me.
Intermediate & Advanced SEO | coldfireinc
-
Why are new pages not being indexed, and why do old pages (now in robots.txt) remain in the index?
I currently have a site that was recently restructured, causing much of its content to be reposted and creating new URLs for each page. To avoid duplicates, all of the existing pages were added to the robots.txt file. That said, it has now been over a week (and I know Google has recrawled the site), and when I search for term X it is still the old page that is ranking, with the new one nowhere to be seen. I'm assuming it's a cached version, but why are so many of the old pages still appearing in the index? Furthermore, all "tags" pages (it's a Q&A site, like this one) were also added to the robots.txt a few months ago, yet I think they are all still appearing in the index. Anyone got any ideas about why this is happening and how I can get my new pages indexed?
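(A sketch of the kind of robots.txt setup described, with hypothetical paths. One well-known caveat worth noting: Disallow blocks crawling, not indexing, so URLs that were already indexed can linger in the index.)

    # robots.txt - blocking the old, reposted URLs and the tag pages
    # (paths are hypothetical)
    User-agent: *
    Disallow: /old-section/
    Disallow: /tags/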
Intermediate & Advanced SEO | corp0803