HELP, my site gets more than 40k visits per day and the server is down. I do not want all these visits...
-
Hello...
I have a website for a local spa in Ecuador. The website has a blog with some tips about health, and suddenly one of the articles went viral on South American Facebook profiles. Now I am receiving 40k visits per day from other countries that are of no interest to me, because my site is for a local business in Ecuador...
I have already blocked some countries by IP, but I am still receiving visits from other South American countries. Because of this, my hosting company took my website down, and I cannot bring it back online: these thousands of visits use more than 25% of the server's CPU, so the hosting company takes the site down again...
I really need to know what to do. I do not want to pay for an expensive dedicated server, because all these visits from other countries are of no interest to me and, as I said before, my business is local.
-
I like that idea, but I'd consider alternatives to Facebook, and I'd use a 302 (temporary) redirect for the article. When the traffic decreases I'd want to remove the redirect, so that all of those links and shares benefit my own website and not another website.
-
It may be too late to help you but in case someone else has a similar problem:
-
Repost the article (or much of it) on Facebook.
-
301 redirect your viral post to the new Facebook post. Your images, post content, etc. won't load from your server at all; visitors will be sent to a server that can handle the load.
Also, by directing visitors to your FB page, you may end up with more likes, more shares, and definitely more fans by the end of it.
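For anyone wondering what that redirect looks like in practice, here is a minimal `.htaccess` sketch using Apache's `Redirect` directive. The article path and the Facebook URL are hypothetical placeholders, not the poster's real URLs; swap `301` for `302` if you want the redirect to be temporary and removable later, as suggested elsewhere in this thread.

```apache
# Hypothetical paths: replace /viral-post and the Facebook URL
# with your real article path and the URL of the repost.
# Use 302 instead of 301 if you plan to remove the redirect later.
Redirect 301 /viral-post https://www.facebook.com/YourPage/posts/1234567890
```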
-
-
To see incoming links:
-
Google Webmaster Tools?
-
You can also use www.majesticseo.com (you need to verify ownership via GWMT so that it shows all the necessary data).
-
And of course you can use opensiteexplorer.org.
But, and this is a big but: Facebook, LinkedIn and Twitter will not show you lower-level links, just a general link like facebook.com/ or linkedin.com/. That will only help you identify the network, not the post...
As Alex said, you will just see these visits in your analytics under "social referrals". To know exactly which Facebook post was THE ONE that brought you the traffic, you would have had to tag your post URLs beforehand, so you could tell the traffic came from that specific one. It is too late for past traffic, but still possible for the current wave. There is a nice post about this by Ana Kravitz (like Lenny, but no relation); she did an amazing job on it and it is very easy to DIY.
Unfortunately, if you do not suspect which post this specific and unusual traffic could have come from, then there is very little you can do about the past, IMHO. Maybe somebody else can give you another hint, but I doubt it. The best you can do is be tidy enough to tag every single new post you make and track it in Analytics, as Alex said.
Here is the link from Ana's blog. Good luck, and if you need any further help, do not hesitate to ring my bell.
http://www.akravitz.com/tag-track-social-media-traffic-for-google-analytics/
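To show what that tagging looks like in practice, here is a minimal sketch using only the Python standard library. The `utm_*` parameter names are Google Analytics' standard campaign parameters; the URL and campaign values are made-up examples.

```python
from urllib.parse import urlencode

def tag_url(base_url, source, medium, campaign):
    """Append Google Analytics campaign (utm_*) parameters to a URL."""
    params = urlencode({
        "utm_source": source,      # which network, e.g. facebook or twitter
        "utm_medium": medium,      # channel, e.g. social
        "utm_campaign": campaign,  # which specific post or share
    })
    return f"{base_url}?{params}"

# Hypothetical post URL and campaign name, for illustration only:
print(tag_url("http://example.com/health-tips", "facebook", "social", "spa-tips-jan"))
```

Share the tagged URL instead of the bare one, and each post then shows up separately under Acquisition > Campaigns in Analytics.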
Cheers from a cloudy (once in a very long while) day in usually sunny Southern Spain.
-
-
I often find that searching for the URL directly on Twitter and in Facebook's public post search is the best way to find out.
If you use Google Analytics you could go Acquisition > All Referrals and click on e.g. facebook.com to see a list of referring URLs. "/" (the homepage) will probably be most of the referrals but there might be some group or page URLs in there. Acquisition > Social > Network Referrals is another route.
-
Thanks to everybody for the comments and help. I will try this CloudFlare to see if it works for me... I will work on the images too and put up some AdSense.
I was seriously thinking of moving my site to wordpress.com, but if I do that I could lose several plugins and benefits of this CMS.
Does anyone know how to track the original post on FB or Twitter that started all this viral stuff?
I'm using Social Mention and Fresh Web Explorer to find people talking about my website, but I cannot find the original post that made it go viral. I would like to know how all this happened; I think it could be valuable information for all of us...
-
I just want to say.... "Nice work!" on creating some content that went viral.
Here is something to think about for the future....
If you are receiving lots of traffic from other countries on a regular basis, you could use Google's DoubleClick ad server to display income-producing ads to visitors from countries outside of Ecuador. You simply create a dedicated ad space on each of your pages. When a visitor from Ecuador arrives, that person sees an ad for one of your spa products. When a person from outside Ecuador arrives, that person sees an AdSense ad.
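In DoubleClick this is done with geo-targeting rules in the ad server's settings rather than with your own code, but the decision it makes for each visitor boils down to something like the sketch below. The country code check and the ad names are purely illustrative assumptions, not DoubleClick's API.

```python
def choose_ad(visitor_country):
    """Illustrative geo-based ad selection: local visitors see the
    business's own promotion; everyone else sees a monetized ad slot."""
    if visitor_country == "EC":    # ISO 3166-1 code for Ecuador
        return "spa-product-ad"    # hypothetical house ad
    return "adsense-ad"            # hypothetical network ad slot

print(choose_ad("EC"))  # spa-product-ad
print(choose_ad("CO"))  # adsense-ad
```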
Just typing this has given me an idea on how I can profit from this on one of my sites. The reward of commenting on forums.
Good luck!
-
Hola Jonathan!
If all those visits are real (and you are not being hacked), I would definitely be happy (as a matter of fact, much more than happy) and prepared to jump from this hosting to a much more powerful one, even if you have to invest to do so. That sort of viral effect is something we marketers seek like the holy grail in the crusades... you shouldn't reject it; on the contrary, you should make good use of it.
You will profit in SEO. Look at it this way: you would have to invest a huge amount of money to get this much relevant traffic.
Build strong SEO around your basic set of keywords for this post and profit from it; it will have a huge impact and payback on SEO positioning. Even though your business is local, you will profit from relevant, real, viral traffic to your site, wherever it comes from, especially if you can turn your blog into a magnet for relevant keywords.
Relevant traffic (wherever it comes from) will bring you positioning, and positioning (also in Ecuador) will bring you visits (from Ecuador) that you can turn into more clients for your local store.
If you do not know how to do this, here at Moz you will find lots of friends who will try to help you do so...
Hasta la vista!
Cheers from an incredibly nice Saturday winter morning in southern sunny, always sunny, Spain.
-
Jonathan, there's no point wasting your time trying to optimise images at this point. Your images are such a tiny percentage of your server load right now that working on them will likely have no effect on whether your host will allow your site back online.
Your host is absolutely right; shared hosting is simply not capable of running PHP and serving requests for 40,000 visits a day to a WordPress site.
There is something that may help, though! Your best bet is to ask your host to help you set up CloudFlare. (It may already be available to you if your hosting account uses the cPanel control panel.) CloudFlare is a Content Delivery Network, or CDN. What it does is take most of the content from your site and store it on hundreds of servers around the world. Then, when a visitor comes to your site, CloudFlare figures out which of its servers is closest to the visitor and serves the files from there. So a huge amount of the work gets done by CloudFlare's servers instead of your own. Using it will typically reduce your server requests by 50-60%.
The basic version of CloudFlare is free. It's pretty straightforward to set up, but having some help from your host will make it easier. Once it's in place, it's quite possible (but unfortunately no guarantee) it will reduce your server load enough to keep your host happy.
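Once CloudFlare is in place, you can check whether a page was actually served from its cache by looking at the `CF-Cache-Status` response header CloudFlare adds to proxied responses (`HIT` means its edge answered; `MISS` or `DYNAMIC` mean your server still did the work). A small sketch of that check, with made-up example headers:

```python
def served_from_cdn(headers):
    """Return True if CloudFlare answered from its edge cache.
    CF-Cache-Status is the header CloudFlare adds to proxied responses."""
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("cf-cache-status", "").upper() == "HIT"

# Example response headers (hypothetical values):
print(served_from_cdn({"CF-Cache-Status": "HIT"}))   # True
print(served_from_cdn({"CF-Cache-Status": "MISS"}))  # False
```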
Your only other alternatives are to block a whole lot more of the unwanted countries, or to upgrade to a stronger server.
If you have any questions, be sure to ask!
Paul
P.S. One other thing that can really help temporarily, even with CloudFlare, is to turn off comments on the posts that are getting the most traffic, assuming there are comments at all.
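As a side note, you can check for yourself how much traffic your server is actually fielding by counting requests per client IP in the raw Apache access log. A minimal sketch; the log format assumed is the common/combined Apache default (client IP is the first field), and the sample lines are made up:

```python
from collections import Counter

def traffic_summary(log_lines):
    """Count total requests and requests per client IP from Apache
    access-log lines (common/combined format: IP is the first field)."""
    per_ip = Counter(line.split(" ", 1)[0] for line in log_lines if line.strip())
    return sum(per_ip.values()), per_ip

# Hypothetical log excerpt:
sample = [
    '203.0.113.5 - - [01/Feb/2014:10:00:01 -0500] "GET /viral-post HTTP/1.1" 200 5120',
    '203.0.113.5 - - [01/Feb/2014:10:00:02 -0500] "GET /style.css HTTP/1.1" 200 900',
    '198.51.100.7 - - [01/Feb/2014:10:00:03 -0500] "GET /viral-post HTTP/1.1" 200 5120',
]
total, per_ip = traffic_summary(sample)
print(total, len(per_ip))  # 3 requests from 2 unique IPs
```

Run against the real log (often `/var/log/apache2/access.log`, though the path varies by host), this gives you the same kind of numbers a host quotes when it blocks an account.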
-
I will try to get access to my cPanel account again and try to do it...
This is what the hosting company says:
Nevertheless, after checking the Apache access logs for divaestetica.com.ec during the hour preceding the last automated CPU block that this account triggered, I observed 111,997 total HTTP requests from 2,685 unique IP addresses. This is quite a bit of traffic for a WordPress site to handle by itself on a shared server.
-
Have your images been optimized for the web, perhaps with the Smush.it plugin? Make sure that if it's a fairly simple image, it's only taking up 20k and not 100k. Perhaps there are images on the page getting all the traffic that make for a nice header, but that you could remove or replace with a smaller file? Are there widgets loading Facebook and Twitter information that could be temporarily disabled?
The links I sent also have some suggestions for situations like this.
-
Hi Keri, thanks for your answer...
Yes, I am using WordPress, and I am already using the WP Super Cache plugin. I updated to the latest version of WP and the latest versions of all plugins, and I also blocked several IPs from some countries, but I am still receiving visits from other places...
What do you mean by "strip out extra images"?
-
In case it is WordPress, here are a couple of posts about how to optimize the site for heavy traffic. Can you then go to your host and say that you've reduced the file sizes on images, are loading fewer items, etc. and then turn the website back on?
http://codex.wordpress.org/High_Traffic_Tips_For_WordPress
http://wpengine.com/2012/02/22/how-to-prepare-wordpress-for-an-onslaught-of-traffic/
http://speed.wpengine.com/ (for when the site is enabled)
-
Is your website on WordPress or another CMS? If so, is there a caching plugin you can use? Can you temporarily strip out any extra images until your traffic dies down, to help reduce the bandwidth being used?