Traffic and load issues

King.

New member
Dec 26, 2010
My recent move to a new server has coincided with one of my sites getting sudden bursts of traffic of up to 50 visitors a second, with each visitor averaging 20 pageviews in a short space of time.

My server load spikes to 20 and beyond whenever this happens, and then the site doesn't load at all. According to my host's tech support, the site is hitting Apache's MaxClients limit. They explained that my site makes a lot of database requests and recommended caching and Apache optimisations.

I'm trying to figure out whether this is something that can be fixed with their recommendations or whether I just need a new server. Should a server with a 3.1 GHz quad-core CPU and 16 GB of RAM be able to handle a site like mine? I can't get a straight answer out of my host or the person who works on my site, so I'm not sure what the solution is.
 


If you are getting 50 visitors a second consistently, that is 4.32 million visitors a day.

If that holds true, a 3.1 GHz quad-core will not be able to handle it. Cache the site to make it static and see how it goes. Set it up on Cloudflare first to get an idea of the average visits. Maybe it's just something going viral.
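The back-of-the-envelope arithmetic behind that figure:

```python
visitors_per_second = 50
seconds_per_day = 60 * 60 * 24           # 86,400
visitors_per_day = visitors_per_second * seconds_per_day
print(visitors_per_day)                  # 4320000, i.e. 4.32 million a day
```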
 
Your website is getting too much traffic, and I don't think your server can handle it with that configuration. I'd suggest upgrading your server.
 
Just to clarify, it's not a constant 50/sec; it just spikes that high sometimes. The daily average is around 200k pageviews.

I got my developer to implement caching. I also increased MaxClients to 600 and turned off KeepAlive in the Apache configuration - these changes alone helped significantly.
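For reference, those two changes look something like this in a prefork Apache config (2.2-era directive names; MaxClients became MaxRequestWorkers in 2.4, and ServerLimit also has to be raised to go past 256):

```apache
<IfModule mpm_prefork_module>
    ServerLimit   600
    MaxClients    600
</IfModule>

# each request gets a fresh connection, freeing workers sooner
KeepAlive Off
```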
 
I got my developer to implement caching. I also increased MaxClients to 600 and turned off KeepAlive in the Apache configuration - these changes alone helped significantly.

Any update on this? Or is your developer just finishing up the caching changes?
 
Any update on this? Or is your developer just finishing up the caching changes?
No, he finished. Increasing MaxClients and turning off KeepAlive didn't actually have much of an impact on their own - I think that's because my site has too much dynamic content. Overall the server is handling traffic better, but I still occasionally get database overload errors and CPU spikes to 50%.
 
They gave you good advice.

Disk IO is almost always the bottleneck in these situations, so you are basically running out of disk IO for the database server. Run top when your server is getting hammered and see what state the top processes are in ... a lot of them will probably be stuck waiting on disk (the "D" uninterruptible-sleep state). You can also look at the %wa figure in top's header - that's the percentage of CPU time spent waiting on disk IO.

I will try to give you some advice. Most of it is based on the assumption that your bottleneck is disk IO - you probably have a cheap dedicated server with one slow disk. Memory is thousands of times faster than a hard disk.

Here are the steps I would take:

1. Make sure you have optimal indexes on your database tables. Use MySQL's EXPLAIN statement to see which indexes are being used for your queries and how many rows have to be evaluated.

You may be able to add indexes, or even compound indexes across multiple columns, to make things perform better.
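MySQL's EXPLAIN output has its own format, but the idea is easy to demo with Python's built-in sqlite3 module (the table and index names here are made up for illustration):

```python
import sqlite3

# sqlite3 standing in for MySQL -- the EXPLAIN idea carries over directly
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT)")
conn.executemany("INSERT INTO posts (author_id, title) VALUES (?, ?)",
                 [(i % 100, "post %d" % i) for i in range(1000)])

query = "SELECT title FROM posts WHERE author_id = ?"

# Without an index the planner has to scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchall()
print(plan_before[0][-1])   # a full table SCAN

conn.execute("CREATE INDEX idx_posts_author ON posts (author_id)")

# With the index it narrows the search to matching rows only.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchall()
print(plan_after[0][-1])    # SEARCH ... USING INDEX idx_posts_author
```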

2. Check whether you can reduce the number of queries. I can't tell you how many times I have looked at code that uses a database abstraction layer and ends up doing 100 queries for something that could actually be done with one.
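A sketch of that anti-pattern with sqlite3 and made-up tables - the classic "N+1" shape, and the single JOIN that replaces it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO posts VALUES (1, 1, 'first'), (2, 2, 'second'), (3, 1, 'third');
""")

# The N+1 pattern: one query for the posts, then one more per post.
queries = 0
naive = []
posts = conn.execute("SELECT author_id, title FROM posts ORDER BY id").fetchall()
queries += 1
for author_id, title in posts:
    name, = conn.execute("SELECT name FROM authors WHERE id = ?", (author_id,)).fetchone()
    queries += 1
    naive.append((title, name))

# The same result in a single round trip with a JOIN.
joined = conn.execute("""
    SELECT p.title, a.name
    FROM posts p JOIN authors a ON a.id = p.author_id
    ORDER BY p.id
""").fetchall()

print(queries)       # 4 round trips the naive way...
print(len(joined))   # ...versus 1 query for the same 3 rows
```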

3. Check whether you can allocate more memory for MySQL's buffers. If you are using MyISAM tables, the main one to look at is the key cache (key_buffer_size); for InnoDB it is the buffer pool (innodb_buffer_pool_size).

If you are using the defaults, these are usually pretty low - something like 8 or 16 MB. If you have the memory on the machine, increase them. I use 4 GB for the buffer pool on my dedicated DB server. Read the docs for rules of thumb on how large you can make these.

You can also look into turning on the query cache and giving it more memory.
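As a sketch, the relevant my.cnf settings might look like this on a 16 GB box (the values are assumptions to size against your own workload, and the query cache only exists in the MySQL 5.x era this thread dates from):

```ini
[mysqld]
# InnoDB: cache data and index pages in memory
innodb_buffer_pool_size = 4G
# MyISAM: cache index blocks
key_buffer_size = 512M
# Query cache (removed entirely in MySQL 8.0)
query_cache_type = 1
query_cache_size = 64M
```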

4. Add caching. More importantly, add caching that uses memory. If you keep pulling the same data from a database and displaying it, you are wasting a lot of resources and disk IO.

If you have a news site and only change the news once per day, you technically only have to hit the database once each time the news changes. You can use something like memcached to put that data in memory, and update your code to pull it from there.
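The pattern is usually called cache-aside. A minimal sketch, with a plain dict standing in for memcached (the function and key names are made up for illustration):

```python
import time

cache = {}  # stands in for memcached: same get/set-with-expiry idea

def get_news(fetch_from_db, ttl=86400):
    """Serve from memory; only hit the database on a miss or after expiry."""
    entry = cache.get("news")
    if entry is not None and entry[0] > time.time():
        return entry[1]                      # cache hit: zero database work
    data = fetch_from_db()                   # cache miss: one real query
    cache["news"] = (time.time() + ttl, data)
    return data

calls = []
def fake_db():
    calls.append(1)
    return ["headline"]

get_news(fake_db)
get_news(fake_db)
print(len(calls))   # 1 -- the database was only queried once
```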

5. If you are using PHP, make sure you have an opcode cache enabled. By default PHP reads the source code from disk and compiles/interprets it on every single page load, which is a horrible waste. An opcode cache compiles the source once and keeps it in memory, then periodically checks whether the modification times on the source files have changed and, if so, recompiles.
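A minimal php.ini sketch for that (the directive names are real opcache settings; the values are illustrative):

```ini
; enable the opcode cache and give it some memory (MB)
opcache.enable = 1
opcache.memory_consumption = 128
; re-check source file mtimes at most once per 60 seconds
opcache.validate_timestamps = 1
opcache.revalidate_freq = 60
```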

If you do a good job optimizing your software, even a very low end modern server can serve well over 100 requests per second.

Be sure to use a tool like ApacheBench (ab) after each step to see whether you are actually making any improvement.

Just remember disks are slow and suck. Memory is fast, use memory as much as possible.
 
I'd recommend switching from Apache to nginx if you can. It's miles faster; team it up with Varnish and you'll have no problems.
 
I'd recommend switching from Apache to nginx if you can. It's miles faster; team it up with Varnish and you'll have no problems.

I agree that nginx with php-fpm is much better than Apache with pre-forking. I'm not sure they would see much benefit until they sort out their database bottleneck, though.
 
Are you sure those are live visitors?

I had some sites that were getting overloaded, and when I traced the IPs behind the hits, over half were coming from server farms, useless search engines, China, and other junk traffic (Hetzner Online, OVH, SingleHop, xeex, Baidu, Yandex, ColoCrossing).

I banned a chunk of IP blocks and some user agents and cut traffic nearly in half.
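If it helps, that kind of ban can live in .htaccess with Apache 2.2-era syntax (the CIDR ranges below are placeholder documentation ranges, not the actual offenders):

```apache
# block specific networks -- substitute the ranges you identified
Order Allow,Deny
Allow from all
Deny from 192.0.2.0/24
Deny from 198.51.100.0/24

# drop obvious bot user agents too
SetEnvIfNoCase User-Agent "baidu|yandex" bad_bot
Deny from env=bad_bot
```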
 
If your website is getting overloaded, it might be time to upgrade your hosting package. If you have a lot of Flash and images on the site and your traffic has grown as well, you can't depend on a shared hosting plan for long. You need a VPS or a dedicated server.
 
Put the database on an SSD, cache the DB queries (memcached), or do what alpaca wrote.
 