Page Size
-
Hello Mozers,
What is the best page size (or the maximum page size, i.e. in KB) for a home page or a 2nd-level page?
Thank you - I appreciate you looking at this question.
Vijay
-
It can also depend on user expectations, the type of content you're delivering and the niche/demographic you're targeting.
If you're running a site to support education in Africa, you would probably want to make sure the site is optimised for lower-bandwidth connections.
If someone clicks on a link expecting a graphically rich, interactive site then they'll probably be prepared to wait a little longer (such as a high-quality image-of-the-week page).
Also, remember that this is going to vary from device to device. No mobile visitor is going to thank you for trying to download gigabytes of data on a page!
As Matt says, keep an eye on your page load speed, and look at where your visitors are abandoning your site to see whether page load times are likely to be an issue.
Understand who your page is aimed at and the technologies/platforms that they are using to consume your content.
On the flip side, there's no point having a page that's quick to load if the content is so brief/thin that it wasn't worth clicking on the link!
The key really is to make sure the content is worth the time and answers the visitors' questions/satisfies their goals. When creating pages, think about the user intent and make sure you're designing for the visitor. Remember visitors are investing their time/attention in your site - make sure they get a return!
If you can spend time testing your assumptions then that'll help you make changes based on real data rather than guesswork.
Hope this helps.
-
There is no set answer for this in relation to a specific size - Google released some statistics a while ago and the average page size was 320kb, I remember reading. What I would say though is that you need to concentrate on load times, so make your page as small as possible whilst serving the content how you want it to be, so it loads quickly. Check how quickly your pages load using Google Webmaster Tools as an indicator - though remember this is taking an average from a sample of visits.
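If it helps, here's a rough way to spot-check a page's HTML transfer size and response time - a minimal TypeScript sketch for Node 18+ (the URL is just a placeholder, and it only measures the HTML document itself, not the images, CSS or scripts the page goes on to load):

```typescript
// Quick spot-check of a page's HTML transfer size and response time.
// Node 18+ (global fetch). Measures only the HTML document itself,
// not the images, CSS, JS or other assets the page goes on to request.
async function checkPage(url: string): Promise<void> {
  const start = Date.now();
  const response = await fetch(url);
  const body = await response.arrayBuffer();
  const elapsedMs = Date.now() - start;
  const sizeKb = body.byteLength / 1024;
  console.log(`${url}: ${sizeKb.toFixed(1)} KB HTML in ${elapsedMs} ms`);
}

// Placeholder URL - swap in the page you want to test.
checkPage("https://www.example.com/").catch(console.error);
```

For the full picture (all assets, render times, mobile vs desktop), your browser's network panel is the better measure; the sketch above is just for quick comparisons.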
And here is a link to a YouMoz post that is very helpful in guiding you through optimizing your page speed - http://www.seomoz.org/blog/optimizing-page-speed-actionable-tips-for-seos-and-web-developers
Hope this helps.
Related Questions
-
How is a Single Page Application (SPA) bad for SEO?
Hi guys. I am quite inspired by the SPA technique. It's really amazing when all your interaction with the site happens on the fly and you don't see any page reloads. I've started implementing the site following this approach and have already found some nice guys to do the design. The only downside of using an SPA that I can see is the **SEO** part. That's because the URL does not really change and different pages don't have their own unique URL addresses. Actually they do, but they look like: yoursite.com/#/products, yoursite.com/#/prices, yoursite.com/#/contact. So all of them come after the # and are just anchors. For Google this means all of these pages are just yoursite.com/. My question is: what is a proven method to implement the URL structure in a Single Page Application so that all the pages are indexed by Google correctly? (Sorry, I don't mention the other search engines because of market share.) The other question, of course, is examples. It would be great to see real-life site examples, ideally authority sites, which use the SPA technique and are well indexed by search engines.
Web Design | Billy_gym
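A commonly recommended alternative to #-fragment routing is the HTML5 History API, which gives each view a real path that crawlers can request directly. A minimal TypeScript sketch of the idea (the route names, render function and data-route links are hypothetical, not from the question):

```typescript
// Minimal client-side routing with the History API instead of # fragments.
// Each view gets a real path (e.g. /products) that the server should also be
// able to answer, so crawlers see crawlable URLs rather than yoursite.com/#/...
type Route = "/" | "/products" | "/prices" | "/contact";

function render(route: Route): void {
  // Hypothetical render step: swap in the view for the given route.
  const app = document.getElementById("app");
  if (app) app.textContent = `Current view: ${route}`;
}

function navigate(route: Route): void {
  history.pushState({ route }, "", route); // updates the address bar, no reload
  render(route);
}

// Re-render when the user presses back/forward.
window.addEventListener("popstate", (event: PopStateEvent) => {
  render((event.state?.route as Route) ?? "/");
});

// Intercept clicks on internal links so navigation stays client-side.
document.addEventListener("click", (event) => {
  const link = (event.target as Element).closest<HTMLAnchorElement>("a[data-route]");
  if (link) {
    event.preventDefault();
    navigate(link.getAttribute("href") as Route);
  }
});
```

The part that matters for indexing is that the server (or a pre-renderer) can also respond to /products, /prices and /contact directly, so a crawler requesting those URLs gets real content rather than an empty shell.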
Do search engines see copy/keywords when it appears only at the bottom of a page?
My client is looking to improve their SEO, and to date I've written meta data and made some initial recommendations. Thing is, on some of their pages, the body copy appears at the bottom of the page, past links and big, splashy images. My question is, will search engines even see that copy to crawl it for keywords? Thanks!
Web Design | MarcieHill
What is your opinion on the use of jQuery for a continuous-scroll type of page layout?
So, I'm in two minds about this; let me start with a bit of background info.
Context
We have a new client who is in the final days of their new site design, and they were when they first contacted us. Their design essentially uses 5 pages, each with several pages' worth of content on it, separated with the use of jQuery. What this means is a user can click a menu item from a drop-down in the nav and be taken directly to that section of content, like using internal anchor links, as if it were a separate page, or they can click the top-level nav item and scroll through each "sub-page" without having to click other links. Vaguely similar to Google's "How Search Works" page if each sector of that page had its own URL, only without the heavy design elements and slow load time. In this process, scrolling down to each new "sub-page" changes the URL in the address bar and is treated as a new page as far as referencing the page, adding page titles, meta descriptions, backlinks etc. From my research this also means search engines don't see the entire page; they see each sub-page as its own separate item, like a normal site.
My Reservations
I'm worried about this for several reasons, the largest of them being that you're essentially presenting the user with something different to the search engines. The other big one is that I just don't know if search engines really can render this type of formatting correctly, or if there's anything I need to look out for here. Since they're so close to launching their new site, I don't have time to set up a test environment and I'm not going to gamble with a new corporate website, but they're also going to be very resistant to the advice of "start the design over, it's too dangerous".
The Positives
For this client in particular, the design actually works very well. Each of these long pages is essentially about a different service they offer and the continuous scrolling through the "sub-pages" acts as almost a workflow through the process, covering each step in order. It also looks fantastic, loads quickly and has a very simple nav, so the overall user experience is great. Since the majority of my focus in SEO is on UX, this is my confusion. Part of me thinks that obscuring the other content on these pages and only showing each individual "sub-page" to search engines is an obvious no-no; the other part of me feels that this kind of user experience and the reasonable prevalence of AJAX/parallax etc. means search engines should be more capable of understanding what's going on here. Can anyone possibly shed some light on this with either some further reading or first-hand experience?
Web Design | ChrisAshton
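For reference, the URL behaviour described above is usually implemented by watching each section with an IntersectionObserver and updating the address bar via history.replaceState as a section scrolls into view. A minimal TypeScript sketch, assuming each "sub-page" is a section element with a data-path attribute (the attribute name is an assumption, not from the question):

```typescript
// Update the address bar as each "sub-page" section scrolls into view.
// Assumes markup like: <section data-path="/services/consulting">…</section>
const sections = document.querySelectorAll<HTMLElement>("section[data-path]");

const observer = new IntersectionObserver(
  (entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const path = (entry.target as HTMLElement).dataset.path;
      if (path && window.location.pathname !== path) {
        // replaceState avoids filling the back-button history with every scroll step.
        history.replaceState(null, "", path);
      }
    }
  },
  { threshold: 0.6 } // treat a section as "current" once ~60% of it is visible
);

sections.forEach((section) => observer.observe(section));
```

Whichever way it's built, the indexing caveat in the question still applies: each of those paths also needs to return the matching content when requested directly, otherwise search engines only ever see the section that's served by default.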
WordPress Category page title h1 or h2
Hi friends, I know this is a minor technical change, but we are in an extremely competitive market and I don't want to have any points against us. On our WordPress category pages, i.e. http://www.domain.com/category/%category-title%/, I looked at the code behind the title of the category page, which is "Browsing: %Category Title%". The code is an h2. I looked at the posts in the category archive below, and those are also h2's. The theme preview is here and you can click on Entertainment - Reviews to see exactly what I'm referring to - http://themeforest.net/item/smartmag-responsive-retina-wordpress-magazine/full_screen_preview/6652608 I changed the code for the "Browsing: %Category Title%" to h1, which I believe is more consistent and standard formatting. 1. Is this a correct technical on-page optimization? 2. Would it be beneficial to remove "Browsing"?
Web Design | JustinMurray
Dynamic pages - Windows server
Hi all, hope I get an answer on this. The client's site is, I believe, hosted on a Windows shared server. The output of the site is something like this: http://www.domainname.com/catering-sub.asp?maincate_id=6&maincate_name=Barware I am looking to get a URL-friendly output for the site - as far as I know, .htaccess doesn't work on this type of hosting? Thoughts? Thanks in advance
Web Design | OnlineAssetPartners
404 page not found after site migration
Hi, A question from our developer. We have an issue in Google Webmaster Tools. A few months ago we killed off one of our e-commerce sites and set up another to replace it. The new site uses different software on a different domain. I set up a mass 301 redirect that would redirect any URLs to the new domain, so domain-one.com/product would redirect to domain-two.com/product. As it turns out, the new site doesn’t use the same URLs for products as the old one did, so I deleted the mass 301 redirect. We’re getting a lot of URLs showing up as 404 not found in Webmaster Tools. These URLs used to exist on the old site and be linked to from the old sitemap. Even URLs that are showing up as 404 recently say that they are linked to in the old sitemap. The old sitemap no longer exists and has been returning a 404 error for some time now. Normally I would set up 301 redirects for each one and mark them as fixed, but there are almost a quarter of a million URLs that are returning 404 errors, and rising. I’m sure there are some genuine problems that need sorting out in that list, but I just can’t see them under the mass of errors for pages that have been redirected from the old site. Because of this, I’m reluctant to set up a robots file that disallows all of the 404 URLs. The old site is no longer in the index. Searching Google for site:domain-one.com returns no results. Ideally, I’d like anything that was linked from the old sitemap to be removed from Webmaster Tools and for Google to stop attempting to crawl those pages. Thanks in advance.
Web Design | PASSLtd
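For cases like this, the usual pattern is a lookup table from old paths to new ones: a 301 where there's an equivalent page, and a 410 (Gone) for old URLs that have no replacement, so crawlers drop them rather than reporting 404s indefinitely. A minimal TypeScript/Express sketch - the map entries are hypothetical, and in practice the table would be generated from the old sitemap or a product export:

```typescript
import express from "express";

// Hypothetical mapping from old-site paths to their new-site equivalents.
// In practice, generate this from the old sitemap or product export.
const redirectMap: Record<string, string> = {
  "/old-product-slug": "https://domain-two.com/new-product-slug",
  "/category/widgets": "https://domain-two.com/widgets",
};

const app = express();

app.use((req, res, next) => {
  const target = redirectMap[req.path];
  if (target) {
    res.redirect(301, target); // permanent redirect to the mapped new URL
  } else if (req.hostname === "domain-one.com") {
    res.status(410).send("Gone"); // no equivalent page: tell crawlers it's gone for good
  } else {
    next(); // not an old-site URL, carry on as normal
  }
});

app.listen(3000);
```

The same idea works in whatever server or CDN is actually in use; the point is the per-URL map plus an explicit "gone" response, rather than a blanket redirect or a robots disallow.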
Landing pages vs internal pages.
Hey everyone, I have run into a problem and would greatly appreciate anyone who could weigh in on it. I have a web client that went to an outside vendor for marketing. The client asked me to help them target some keywords, and since I am new to the SEO world I have proceeded by researching the best keywords for the client. I found 6 that see excellent monthly searches. I then registered the .com and/or .net domain names that match these words. I then started building landing pages that make reference to the keyword and then have links to his site to get more info. My customer sent the first of these sites to the marketer and he says I am doing things all wrong. He says rather than having landing pages like this I should just point the domain names at internal pages of the website. He also says that I should not have different looks for the landing pages from the main site and that I should have the full site menu on each landing page. I wanted to hear what everyone here has to say about the pros and cons of the way to do this, because the guy giving the advice has a lower-ranking site than I do, and I have only started working on getting my site ranked this year. He has, at least according to him, been doing this forever. Thanks, Ron
Web Design | bsofttech
Do Pages That Rearrange Set Off Any Red Flags for Google?
We have a broad content site that includes crowdsourced lists of items. A lot of the pages allow voting, which causes the content on the pages (sometimes up to 10 pages deep) to completely rearrange, and therefore spread out and switch pages often among the (up to 10) pages of content. Now, could this be causing any kind of duplicate content or any other kind of red flag for Google? I know that the more the page changes the better, but if it's all the same content being moved up and down constantly, could Google think we're pulling some kind of "making it look like we have new content" scheme and ding us for these pages? If so, what would anyone recommend we do? Let's take an example of a list of companies with bad customer service. We let the internet vote them up and down all the time, and the order changes depending on the votes in real time. Is that page doomed, or does Google see it and love it?
Web Design | BG19850