Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
My Site Is Using A Lot of Hosting Bandwidth. Suggestions?
-
My website http://www.socialseomanagement.com/ is using tons of bandwidth. I received a message from the hosting company saying I exceeded my monthly bandwidth and it has only been a few days. Can anyone take a look and make suggestions?
Thanks
-
Something doesn't sound right there. You can't have 3,112 unique visits from 12,093 unique visitors, unless your log file program is using those terms in a different way. If your log file program lets you, you should be able to see which files have been using the most bandwidth.
-
As of October 4th, these are my statistics for October 2012 so far:
- 427K hits
- 3112 unique visits from 12K sources
- 2.65GB data
- 12093 unique visitors
-
Hmm... based on the other responses above, my load time was not very long. Did you try visiting more than once?
-
Is there a way to test the speed of the server also? I am checking my server logs as you suggested to see if someone is linking to our images...
-
The reason the site "looks" slow to testing tools is that the Google+ API call and the Facebook Connect call take an obscenely long time to negotiate their SSL connections - over 11 seconds.
Fortunately the essential user-visible parts of the page (up to document complete) load quickly so the user experience is fine. Might be worth trying to figure out why the SSL negotiation on those 2 files is getting killed so badly though.
Here's a link to the page load Waterfall View that clearly shows the long purple lines indicating SSL delay for those 2 scripts.
-
Your homepage is a little over 900KB, which isn't massive by any means. It loads in a little over 3.5 seconds from a Virginia location using the equivalent of a slow cable connection, so it looks well speed-optimized to me.
If you've seen that big a jump in bandwidth use with no accompanying jump in traffic, my strong guess is that some other site is leeching your content (otherwise known as hotlinking).
Typically, this means somebody else has embedded images from your site into their own, meaning every time their page displays the image, it loads from your server/bandwidth instead of their own. (It can also happen with other file types.)
Some sites do this because they don't know any better, and sometimes it's malicious. I've seen this often happen from forums where a whole page of post images loads every time the thread is viewed. To see if this is happening, you'll need to check in your server logs. Google Analytics won't show it as the images aren't tagged with the tracking code.
If hotlinking is the problem, the only way to stop it is to tell your server to serve images only when they're requested from your own website's pages. You do this through the .htaccess file. If your site uses cPanel as its hosting control panel, there's actually a button in the Security section to disable hotlinking. I assume most other control panels have something similar. Be aware that this will stop ALL images from showing on other sites, including any badges etc. you may have created for other sites to display intentionally.
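If you're not on cPanel, here's a minimal .htaccess sketch of the same idea (assuming Apache with mod_rewrite enabled; example.com is a placeholder for your own domain):

```apache
RewriteEngine On
# Allow requests with an empty Referer (direct requests, some privacy proxies)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests referred by your own pages (adjust the domain to yours)
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Everything else requesting an image gets a 403 Forbidden
RewriteRule \.(jpe?g|png|gif)$ - [F,NC,L]
```

As with the cPanel button, this blocks all third-party embedding, so add extra RewriteCond lines to whitelist any domains that should legitimately display your badges.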
Paul
-
Do you have plugins?
-
Load time can be a server issue, a software/coding issue, and/or a filesize issue.
4GB of bandwidth in 4 days is a lot - that's about how much my site, which gets 30K+ uniques per month, has used. Based on Alexa rank, it looks like your site doesn't get that much traffic? Have you had an uptick in traffic?
You can check page and file sizes at http://tools.pingdom.com/fpt/ - that tool shows your homepage and its associated files have a total size of about 1MB. To use 4GB of bandwidth at that rate, you'd need roughly 4,000 pageviews.
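As a sanity check, here's the arithmetic behind that estimate (a rough sketch; the 1MB page weight and 4GB figure are the numbers quoted above):

```python
# Rough estimate: pageviews needed to consume a given amount of bandwidth
page_weight_mb = 1        # homepage + assets, per the Pingdom result above
bandwidth_gb = 4          # bandwidth reportedly used in the first few days

pageviews = (bandwidth_gb * 1024) / page_weight_mb
print(round(pageviews))   # roughly 4,000 pageviews to burn through 4GB
```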
You could run your top pages through that tool and see if any of the embedded images or other files are too large.
You could also check your log files to see if someone has embedded an image from your site, you're getting crawled by robots, etc.
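If your host gives you raw access logs, a quick way to find the bandwidth hogs is to total response bytes per URL. Here's a rough sketch assuming an Apache/nginx combined-format log (the sample lines and paths are made up; adjust the regex if your host logs a different format):

```python
import re
from collections import Counter

# Captures the request path and the response size from a combined-format line, e.g.
# 1.2.3.4 - - [04/Oct/2012:10:00:00 +0000] "GET /images/slider1.jpg HTTP/1.1" 200 845000 "-" "-"
LOG_RE = re.compile(r'"[A-Z]+ (\S+)[^"]*" \d{3} (\d+)')

def bytes_by_path(log_lines):
    """Sum response bytes per requested path to spot bandwidth hogs."""
    totals = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m:
            totals[m.group(1)] += int(m.group(2))
    return totals

sample = [
    '1.2.3.4 - - [04/Oct/2012:10:00:00 +0000] "GET /images/slider1.jpg HTTP/1.1" 200 845000 "-" "-"',
    '5.6.7.8 - - [04/Oct/2012:10:00:01 +0000] "GET /images/slider1.jpg HTTP/1.1" 200 845000 "-" "-"',
    '9.9.9.9 - - [04/Oct/2012:10:00:02 +0000] "GET /index.html HTTP/1.1" 200 12000 "-" "-"',
]
for path, total in bytes_by_path(sample).most_common(3):
    print(path, total)
```

If one image dominates the totals and the Referer column of those lines points at a domain that isn't yours, you've found your hotlinker.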
Hope that helps!
-
Plus the site already caches...
-
I am using Drupal.
-
Nearly 4GB in 4 days... The slider images are rather large. Are there any good compression tools? Yahoo's Smush.it didn't make much of a difference.
Any advice on how to make it load faster? Or is that a server issue?
-
The site appears to use Drupal. A cache plugin may make the pages load faster, but it won't change the amount of bandwidth used.
-
Well, you're probably either getting more traffic, or you need to optimize your file sizes - possibly both.
How much bandwidth have you used?
Usually images or videos are what gobbles up bandwidth. Are you hosting any large images, videos, or files?
-
Hi James,
Your site takes quite some time to load. Although the graphics load quickly, there is still something loading in the background.
I suggest checking which plugins you are using and disabling them one by one to see what could be conflicting with the load times.
I'm not sure which CMS you are using, but I suggest getting a cache plugin to speed up loading if it's images slowing things down (probably not, since the graphics loaded quickly).
-
Load time was pretty long. How are your file sizes?
Related Questions
-
Is managed WordPress hosting bad for SEO?
Hi, I would like to create my own website, but I am confused whether to choose cPanel hosting or managed WordPress.
Web Design | alan-shultis
-
Can I safely assume that links between subsites on a subdirectory-based multisite will be treated as internal links within a single site by Google?
I am building a multisite network based in subdirectories (of the mainsite.com/site1 kind) where the main site is like a company site, and subsites are focused on brands or projects of that company. There will be links back and forth from the main site and the subsites, as if subsites were just categories or pages within the main site (they are hosted in subfolders of the main domain, after all). Now, Google's John Mueller has said: "As far as their URL structure is concerned, subdirectories are no different from pages and subpages on your main site. Google will do its best to identify where sites are separate, but as the URL structure is the same as for a single site, you should assume that for SEO purposes the network will be treated as one site." This sounds fine to me, except for the part "Google will do its best to identify where sites are separate", because then, if Google establishes that my multisite structure is actually a collection of different sites, links between subsites and the main site would be considered backlinks between my own sites, which could therefore be considered a link wheel - a kind of linking structure Google doesn't like. How can I make sure that Google understands my multisite as a single site? P.S. - The reason I chose this multisite structure, instead of hosting brands in categories of the main site, is that the subdirectory-based multisite feature lets me map a TLD domain to any of my brands (subsites) whenever I choose to give that brand a more distinct profile, as if it really were a different website.
Web Design | PabloCulebras
-
Do WordPress sites outrank SquareSpace?
I was a big fan of WordPress. I used it for 10 years. However, because I run a very small business, the constant upkeep needed on WP started to frustrate me, so I moved to SquareSpace. However, I am beginning to question my decision, as one of my sites is struggling really badly, and I mean badly. The other sites are okay. So I started asking around, and most people are saying there shouldn't be a difference. A few people have said their WordPress sites always outrank their SquareSpace sites. Then I read what Rand Fishkin said in the Twitter thread below, and now I am even more confused. I am very reluctant to move back to WordPress, it's just so much hassle. But at the same time, if a site doesn't get much traffic then it's useless. https://twitter.com/drew_pickard/status/991659074134556673 https://twitter.com/randfish/status/991974456477278209 Please let me know your thoughts and experience.
Web Design | RyanUK
-
Lots of Listing Pages with Thin Content on Real Estate Web Site - Best to Set Them to No-Index?
Greetings Moz Community: As a commercial real estate broker in Manhattan, I run a web site with over 600 pages. Basically the pages are organized in the following categories:
1. Neighborhoods (Example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan) - 25 pages, low bounce rate
2. Types of Space (Example: http://www.nyc-officespace-leader.com/commercial-space/loft-space) - 15 pages, low bounce rate
3. Blog (Example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take) - 30 pages, medium/high bounce rate
4. Services (Example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space) - 3 pages, high bounce rate
5. About Us (Example: http://www.nyc-officespace-leader.com/about-us/what-we-do) - 4 pages, high bounce rate
6. Listings (Example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf) - 300 pages, high bounce rate (65%), thin content
7. Buildings (Example: http://www.nyc-officespace-leader.com/928-broadway) - 300 pages, very high bounce rate (exceeding 75%)
Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them to "No-Index, Follow". They believe the thin content could be hurting me. Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "No-Index" they could interpret this as the site seeking to hide something and penalize us. Also, the building pages have a low click-through rate. Would it make sense to set them to "No-Index" as well? Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-through rates to "No-Index"? Any harm in doing this for about half the pages on the site? I might add that while I don't suffer from any manual penalty, volume has gone down substantially in the last month. We upgraded the site in early June and somehow 175 pages were submitted to Google that should not have been indexed. A removal request has been made for those pages. Prior to that we were hit by Panda in April 2012, with search volume dropping from about 7,000 per month to 3,000 per month. Volume had increased back to 4,500 by April this year, only to start tanking again. It was down to 3,600 in June. About 30 toxic links were removed in late April and a disavow file was submitted to Google in late April for removal of links from 80 toxic domains. Thanks in advance for your responses!! Alan
Web Design | Kingalan1
-
Best way to indicate multiple Lang/Locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing. Question: Consider an ecommerce site that has multiple sites around the world that are all variations of the same thing, just in different languages. Some of these exist on just the normal .com while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using
<loc>http://www.example.co.uk/en_GB/</loc>
<xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
<xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />
would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting about 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of re-affirmation that I am doing this right. Thanks!
Web Design | DRSearchEngOpt
-
Is it better to redirect a url or set up a landing page for a new site?
Hi, One of our clients has a new website but is still getting quite a lot of traffic to her old site, which has a page authority of 30 on the home page and about 20 external backlinks. It's on a different hosting package, so a different C block, but I was wondering if anyone could advise whether it would be better to simply redirect this page to the new site or set up a landing page on this domain simply saying "Site has moved, you can now find us here..." sort of idea. Any advice would be much appreciated. Thanks
Web Design | Will_Craig
-
Infinite Scrolling vs. Pagination on an eCommerce Site
My company is looking at replacing our ecommerce site's paginated browsing with a Javascript infinite scroll function for when customers view internal search results--and possibly when they browse product categories also. Because our internal linking structure isn't very robust, I'm concerned that removing the pagination will make it harder to get the individual product pages to rank in the SERPs. We have over 5,000 products, and most of them are internally linked to from the browsing results pages in the category structure: e.g. Blue Widgets, Widgets Under $250, etc. I'm not too worried about removing pagination from the internal search results pages, but I'm concerned that doing the same for these category pages will result in de-linking the thousands of product pages that show up later in the browsing results and therefore won't be crawlable as internal links by the Googlebot. Does anyone have any ideas on what to do here? I'm already arguing against the infinite scroll, but we're a fairly design-driven company and any ammunition or alternatives would really help. For example, would serving a different page to the Googlebot in this case be a dangerous form of cloaking? (If the only difference is the presence of the pagination links.) Or is there any way to make rel=next and rel=prev tags work with infinite scrolling?
Web Design | DownPour
-
Footer backlinks for sites I've developed
I link back to my website via my company name in the footers of sites I develop. Lately I've been changing this to my keyword, mixing and matching. I've done this for new sites I create, and for old sites I've not seen any benefit so far after a couple of months. Most of my clients are hosted on the same server as the main site they link back to.
1. Is it a bad idea to link back on the same IP?
2. Are footer backlinks to the main developer going to annoy Google?
3. Should I change my main site's server - will it help?
All my competitors seem to do it and, as far as I can tell, they seem to get better results than I do. Is the reason I see no benefit because I'm now changing them? Thanks
Web Design | sanchez1960