What's the science behind learning how to optimize page loading speed?

Shogun24

New member
Jun 5, 2012
I am about to deploy my first web site, however I don't know beforehand how much RAM I would need to have the pages load under 2 seconds. My question is, are there tools to check the entire memory size of your website, and how that relates to its loading speed before deploying?

For example, an ecommerce site with, say, 50 images, where I only expect a couple thousand viewers per month would have no benefit from a large amount of memory like 1GB correct? I would also like to know how measuring the memory your database takes comes into play.
 


The science is pretty much: Your website's speed will be bound by database I/O across the board. The database is the slowest interaction in a request.

RAM isn't a generic bottleneck unless you're doing something wrong like buffering large file uploads into server memory instead of streaming them. But that's pretty much N/A for you.

So, the science becomes: Cache everything that you can so you avoid database requests.

If your ecommerce homepage displays the 10 top-selling items, you shouldn't be running this every time someone loads the page:

SELECT * FROM products ORDER BY amount_sold DESC LIMIT 10

Ideally, you'd run that once and write the result to "index.html". Then your server will just serve up "index.html" instead of hitting your database, and serving static HTML files is blazing fast. That's literally what the popular WordPress caching plugins do -- write everything into .html files and only regenerate them when, say, you add/edit a post.
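A minimal sketch of that "run it once, write index.html" idea, in Python for illustration (sqlite3 standing in for MySQL; the schema and sample data are made up):

```python
import sqlite3

# Hypothetical schema and data, just to make the example self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, amount_sold INTEGER)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [("widget", 50), ("gadget", 120), ("gizmo", 80)])

# Run the expensive query ONCE...
rows = conn.execute(
    "SELECT name FROM products ORDER BY amount_sold DESC LIMIT 10"
).fetchall()

# ...and bake the result into a static file the web server can hand out
# directly, with no database round-trip per visitor.
html = "<ul>" + "".join(f"<li>{name}</li>" for (name,) in rows) + "</ul>"
with open("index.html", "w") as f:
    f.write(html)
```

You'd re-run this script whenever the products change (cron job, admin-panel hook, whatever), not on every page view.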

Doing this by hand is pretty non-trivial unless you have some programming experience, but frameworks (including your ecommerce platform) generally have cache settings baked in.

Buying more RAM and CPU won't increase the speed of your new website. Those two seconds the user has to wait are spent on data/network transport, like making the request to download your jQuery file hosted on Google, loading the products from the database, and the general latency between the user and your shitty China server.

I'd look into a performance analytics tool like the god-tier http://newrelic.com/. It shows you exactly where the bottlenecks are, like which DB queries are long-running. It breaks the response time down into network, database, and rendering.
 
Wow... thank you so much, really. I honestly just expected to be dick-rolled 5 times.
 

Sorry for how noobish this is, but does getting a shared database or a VPS matter in regards to the speed of my ecommerce platform (the database speed)? I was told to start out with a VPS...
 
Sorry for how noobish this is, but does getting a shared database or a VPS matter in regards to the speed of my ecommerce platform (the database speed)? I was told to start out with a VPS...

A mediocre VPS should start you out just fine. Make sure it has 512 MB of RAM or more. Start out small; you'd be surprised how much power you get from a lean, mean Linux server.

Also like someone else said, the DB is a big problem. Make sure the DB is hosted on the same box as your server or in the same data center for best speed.

Make sure to index any column you match against. Only grab the columns you need. Use LIMIT on pretty much every query where you know the number of records being retrieved or updated ahead of time. Never use RAND(). Every once in a while, run OPTIMIZE TABLE on all your tables. If you have no use for old data, get rid of it. Make good use of server memory and query the DB as infrequently as possible. Use sessions to keep track of users, etc...
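The "index what you match against, grab only what you need" advice, sketched in Python with sqlite3 (table and index names are invented; the same idea applies to MySQL with EXPLAIN):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, name TEXT, bio TEXT)"
)

# Index the column you match against...
conn.execute("CREATE INDEX idx_users_email ON users (email)")
conn.executemany(
    "INSERT INTO users (email, name, bio) VALUES (?, ?, ?)",
    [(f"user{i}@example.com", f"user{i}", "...") for i in range(1000)],
)

# ...then grab only the columns you need, with a LIMIT.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT name FROM users WHERE email = ? LIMIT 1",
    ("user42@example.com",),
).fetchall()

# The plan should mention the index rather than a full table scan.
print(plan)
```

Without the index, the same lookup walks all 1000 rows; with it, the plan shows an index search.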

If your page loads are taking 2 seconds and you're not doing much traffic, you're probably doing something wrong.

Buying more RAM and CPU won't increase the speed of your new website.
I don't see how it won't. The more resources you have at hand, the better performance you'll get and the more traffic you can handle.
 
The science is pretty much: Your website's speed will be bound by database I/O across the board. The database is the slowest interaction in a request.

RAM isn't a generic bottleneck unless you're doing something wrong like buffering large file uploads into server memory instead of streaming them. But that's pretty much N/A for you.

So, the science becomes: Cache everything that you can so you avoid database requests.
There's a lot of misinformation in this post. Your website will not always be bottlenecked by the database. Increasing the CPU and RAM will very likely increase performance. Also, for an eCommerce site, you likely won't be able to implement page caching as this post mentions because almost every page on your site will be dynamic. You want to tell the user how many items are in their cart, right? You want to show them their username if they are logged in? There are some ways to get around this with Javascript but for your case, don't worry about it for now. Several frameworks allow for page fragment caching which would also get around this.
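Fragment caching as described can be sketched in a few lines (Python for illustration; the function names and 60-second TTL are made up): cache the expensive product-list fragment, but render the per-user bits -- cart count, username -- fresh on every request.

```python
import time

_cache = {}  # fragment key -> (expires_at, html)

def cached_fragment(key, ttl, render):
    """Return cached HTML for `key`, re-rendering at most once per `ttl` seconds."""
    now = time.time()
    hit = _cache.get(key)
    if hit and hit[0] > now:
        return hit[1]
    html = render()
    _cache[key] = (now + ttl, html)
    return html

def render_top_sellers():
    # Imagine an expensive multi-join DB query here.
    return "<ul><li>widget</li><li>gadget</li></ul>"

def render_page(username, cart_count):
    # Dynamic bits rendered every request; the heavy fragment comes from cache.
    header = f"<p>Hi {username}, {cart_count} items in your cart</p>"
    return header + cached_fragment("top_sellers", ttl=60, render=render_top_sellers)

print(render_page("shogun24", 3))
```

In a real app the cache would live in memcached or similar rather than a process-local dict, but the split between cached fragment and dynamic shell is the same.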

If you're setting up a VPS, start with the least amount of RAM your provider offers. You can always bump this up later without migrating to a new server. Get your site running and throw some traffic at it and see how it performs. Then optimize.

You should also evaluate your site against Google Pagespeed and YSlow.

Premature optimization is bad news.
 
If you're setting up a VPS, start with the least amount of RAM your provider offers. You can always bump this up later without migrating to a new server. Get your site running and throw some traffic at it and see how it performs. Then optimize.

I would disagree there; I would recommend at least 512 MB of RAM, or maybe 400-something. Lots of places offer a 256 MB RAM package, and there really isn't enough overhead to do much; get basically any traffic and your site will fall over. Most LAMP setups will be pushing toward the 200 MB RAM usage range in my experience, so it really doesn't give you much room to work with.
 
The amount of RAM you have will dictate how many server processes you can run (generally, you'll have a process per request/visitor if you're using Apache). So the more simultaneous requests, the more RAM you'll need.

You might also need more RAM for background queues, database indexes, etc. You can use something like ApacheBench to stress test your server a bit if you want.
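The process-per-request arithmetic works out roughly like this; every number below is an illustrative guess, not a measurement:

```python
# Back-of-the-envelope: how many Apache workers fit in RAM?
total_ram_mb = 512
os_and_mysql_mb = 200   # base OS + MySQL overhead (rough LAMP figure)
per_worker_mb = 25      # a typical mod_php Apache worker process

# Integer division: workers that fit in what's left over.
max_workers = (total_ram_mb - os_and_mysql_mb) // per_worker_mb
print(max_workers)  # roughly how many simultaneous requests before swapping
```

So a 512 MB box handles on the order of a dozen truly simultaneous dynamic requests; something like `ab -n 1000 -c 10 http://yoursite/` (ApacheBench, as mentioned above) will show you where it actually falls over.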
 
I would disagree there; I would recommend at least 512 MB of RAM, or maybe 400-something. Lots of places offer a 256 MB RAM package, and there really isn't enough overhead to do much; get basically any traffic and your site will fall over. Most LAMP setups will be pushing toward the 200 MB RAM usage range in my experience, so it really doesn't give you much room to work with.
Like I said, generally with a VPS, you can upgrade RAM with the click of a button. And ya, you're right, down the road you'll probably need to upgrade to more RAM.
 
If you have a VPS or dedi this is simple. Get the Varnish script from unixy; it blows LiteSpeed and Apache out of the water as far as page/database loading speed goes.
 
Increasing the CPU and RAM will very likely increase performance. Also, for an eCommerce site, you likely won't be able to implement page caching as this post mentions because almost every page on your site will be dynamic. You want to tell the user how many items are in their cart, right? You want to show them their username if they are logged in? There are some ways to get around this with Javascript but for your case, don't worry about it for now. Several frameworks allow for page fragment caching which would also get around this.

If you're setting up a VPS, start with the least amount of RAM your provider offers. You can always bump this up later without migrating to a new server. Get your site running and throw some traffic at it and see how it performs. Then optimize.

You should also evaluate your site against Google Pagespeed and YSlow.

Premature optimization is bad news.

The general ecommerce platform has a menagerie of DB queries on every page. That's the bottleneck. OP is going to have query performance issues before he needs to scale beyond his initial cheap 512 MB virtualized-core instance.

I don't know what y'all think goes into memory, but it's cache, DB heap, and process replication. Which one of those is going to OOM this guy's new ecommerce app? His 128 KB YAML file of products?

Yeah, page caching was just an example to help explain what caching is to someone who isn't very experienced. You're obviously relegated to fragment caching with non-trivial pages, but the distinction is rather moot when the concept of caching in general is the more important thing to understand.
 
Honestly, when dealing with page-loading speed, you need to be using a request-profiling tool (most frameworks come with one, or you can set up your own).

Based on what the profiler shows you, you'll easily see your bottlenecks. It could be a lot of small asset requests, a bunch of really unoptimized queries, dynamic image resizing, etc. It's all relative to the site in question.

Just profile and optimize.
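A "set up your own" request profiler can be as small as this Python sketch (section names and sleep times are stand-ins for real work):

```python
import time
from contextlib import contextmanager

timings = {}

@contextmanager
def profile(section):
    """Accumulate wall-clock time per named section of a request."""
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[section] = timings.get(section, 0.0) + time.perf_counter() - start

# Simulated request:
with profile("db"):
    time.sleep(0.05)   # pretend this is a slow query
with profile("render"):
    time.sleep(0.01)   # pretend this is template rendering

# The biggest bucket is the thing worth optimizing first.
slowest = max(timings, key=timings.get)
print(slowest, timings)
```

Wrap your real DB calls and template rendering in `with profile(...)` blocks and the numbers tell you where the two seconds actually go.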
 
Most VPSes give you a certain amount of RAM, but they also give you burst RAM, around double, to accommodate brief periods when additional RAM may be needed. If you're not getting a shitload of traffic, burst RAM will be available on the initial page load; after that, the static assets will typically be stored in the user's browser cache.

MySQL keeps some stats on RAM usage in a table, but the framework's profiler can give you all the data you need. For images, scripts, etc., Firebug's Net panel can do the client-side profiling.
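The "stored in the user's browser cache" part only happens if the server sends cache headers. A minimal Python sketch (the extension list and one-day max-age are arbitrary example choices):

```python
from http.server import SimpleHTTPRequestHandler

def cache_header_for(path):
    """Pick a Cache-Control value for a request path."""
    static_exts = (".css", ".js", ".png", ".jpg", ".gif")
    if path.endswith(static_exts):
        return "public, max-age=86400"   # let browsers keep assets for a day
    return "no-cache"                    # always revalidate HTML pages

class CachingHandler(SimpleHTTPRequestHandler):
    # Sketch: attach the chosen header to every response we serve.
    def end_headers(self):
        self.send_header("Cache-Control", cache_header_for(self.path))
        super().end_headers()
```

With headers like that, repeat visitors only re-download the HTML, and images/scripts come straight from their browser cache.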
 
The science is pretty much: Your website's speed will be bound by database I/O across the board. The database is the slowest interaction in a request.

The fuck are you talking about?

I've put arrows next to the shit that is important, learn something and stfu.

[attached image: v74bk6.jpg]
 
OP frames his concern around VPS resources, so, with perhaps too much tunnel-vision, that's what I addressed.

Are there other aspects to how responsive your website is to a user? Sure.
 
OP frames his concern around VPS resources, so, with perhaps too much tunnel-vision, that's what I addressed.

Are there other aspects to how responsive your website is to a user? Sure.

Understood, but short of the OP having incredibly inefficient code, a massive database, or a total lack of any JavaScript, I'd like to see a fairly solid example of the initial page request (HTML only) taking anywhere near 2 seconds to load, barring a third-world net connection.

For example, vBulletin is bloated, inefficient software, and even with the extra extensions and the 1.6 million posts WF has, I'm still only seeing ~600 ms requests. People need to focus on the browser first and deal with the hardware/software scaling later.
 
Understood, but short of the OP having incredibly inefficient code, a massive database, or a total lack of any JavaScript, I'd like to see a fairly solid example of the initial page request (HTML only) taking anywhere near 2 seconds to load, barring a third-world net connection.

For example, vBulletin is bloated, inefficient software, and even with the extra extensions and the 1.6 million posts WF has, I'm still only seeing ~600 ms requests. People need to focus on the browser first and deal with the hardware/software scaling later.

Almost all forum software relies heavily on caching in production. Same with something like Drupal, which is dog-shit slow in development mode but works just fine in production mode (caching turned on).
 
Almost all forum software relies heavily on caching in production. Same with something like Drupal, which is dog-shit slow in development mode but works just fine in production mode (caching turned on).

Yet vBulletin makes 14 SQL requests on a forum index, 12 on a thread index, and 9 on a thread request -- and that is on a copy with more caching than the default, and not even for a logged-in user.

Page generated in 0.33544397354126 seconds with 9 queries, spending 0.0035548210144043 doing MySQL queries and 0.33188915252686 doing PHP things.
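That footer line is just accounting: total wall-clock time for the request, minus the time spent inside queries. A Python sketch of the same bookkeeping (sqlite3 in place of MySQL; the data and sleep are illustrative):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany("INSERT INTO posts (body) VALUES (?)", [("post",)] * 100)

page_start = time.perf_counter()
db_time = 0.0
query_count = 0

def timed_query(sql, args=()):
    """Run a query while accumulating DB time, vBulletin-footer style."""
    global db_time, query_count
    t0 = time.perf_counter()
    rows = conn.execute(sql, args).fetchall()
    db_time += time.perf_counter() - t0
    query_count += 1
    return rows

posts = timed_query("SELECT body FROM posts LIMIT 10")
time.sleep(0.02)  # pretend this is template rendering ("doing PHP things")

total = time.perf_counter() - page_start
print(f"Page generated in {total:.4f}s with {query_count} queries, "
      f"spending {db_time:.4f}s in SQL and {total - db_time:.4f}s elsewhere")
```

Which is exactly the pattern in the quoted output: the queries themselves took ~3.5 ms, and the other ~330 ms was PHP.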