I've got shared hosting on a LAMP setup. Obviously, the fewer calls to the database per page, the better. But how many is too many? Two? Ten? A hundred? Curious what people think.
Remember that 100,000 page requests spread over 24 hours is only just over one per second (100,000 / 86,400 ≈ 1.16), as long as they don't all arrive at once.
One or fewer is always best. Two is usually one too many.
If you can return multiple result sets in a single query, then do it. If the information is fairly static, then cache it and pull from cache.
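To illustrate the cache-and-pull idea in PHP, here is a minimal sketch that assumes the host has APCu enabled (many shared hosts don't, in which case a file or memcached cache follows the same shape); the `categories` table and cache key are made up for the example:

```php
<?php
// Fetch a fairly static list from cache, falling back to a single
// DB query when the cache is cold. Table/column names are illustrative.
function getCategories(PDO $db): array
{
    $key = 'categories_v1';

    // apcu_fetch() sets $hit to false on a cache miss.
    $cached = apcu_fetch($key, $hit);
    if ($hit) {
        return $cached;
    }

    $rows = $db->query('SELECT id, name FROM categories ORDER BY name')
               ->fetchAll(PDO::FETCH_ASSOC);

    // Keep it for 10 minutes; tune the TTL to how often the data changes.
    apcu_store($key, $rows, 600);

    return $rows;
}
```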
Ten separate database calls is not good, but it's not going to kill a low-usage site.
I would say that depends on the server load. If you have one visitor per minute, then 1-10 DB calls per page is just fine. If your load is higher than that, say 10 page requests per second, then you should consider caching to minimize the load on your DB server.
How long is a piece of string? How long should a man's legs be? How many DB queries should you make on a page load?
There's no single answer. Obviously, making unnecessary queries is a bad idea. Opening excessive DB connections is even worse. Caching unchanging values is good. Beyond that, you can't really say arbitrarily "you should only use $N queries" on a page; it depends on what you're trying to do and what your performance goals are.
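On the connection point, one common pattern (just a sketch; the DSN, credentials, and `posts` table are placeholders) is to open a single PDO connection per request and have every query on the page reuse it, instead of letting each helper open its own:

```php
<?php
// One connection per page request; every query reuses it.
function db(): PDO
{
    static $pdo = null;
    if ($pdo === null) {
        $pdo = new PDO(
            'mysql:host=localhost;dbname=myapp;charset=utf8mb4',
            'user',
            'password',
            [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
        );
    }
    return $pdo;
}

// Anywhere on the page:
$posts = db()->query('SELECT id, title FROM posts LIMIT 10')->fetchAll();
```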
In theory, any application could be written to use a single DB query - even if that query is a massive 20-way join involving unindexed full table scans that returns thousands of mostly-null rows and takes a ridiculous amount of memory and time to process once it reaches your application. Clearly, this would be a Very Bad Thing. In general, avoid doing things that are obviously wasteful (like doing a bunch of single-row queries in a loop - see the sketch below) and worry about performance later.
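For the "single-row queries in a loop" point, a rough before-and-after sketch with PDO; `$db`, `$userIds`, and the `users` table are made-up names for illustration:

```php
<?php
// $db is a PDO connection, $userIds an array of integer IDs.

// Wasteful: one query per ID, i.e. N round trips to MySQL.
$users = [];
foreach ($userIds as $id) {
    $stmt = $db->prepare('SELECT * FROM users WHERE id = ?');
    $stmt->execute([$id]);
    $users[] = $stmt->fetch(PDO::FETCH_ASSOC);
}

// Better: one query for all of them, with one placeholder per ID.
$placeholders = implode(',', array_fill(0, count($userIds), '?'));
$stmt = $db->prepare("SELECT * FROM users WHERE id IN ($placeholders)");
$stmt->execute($userIds);
$users = $stmt->fetchAll(PDO::FETCH_ASSOC);
```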
In the words of Donald Knuth, "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil." Everyone talks about 'scalability' like they're really going to be the next Twitter, but in reality, if Twitter had focused on being as big as they are now, they probably never would've gotten a product out the door in the first place.
That really depends on your (DB) server setup. Try to cache as much information as possible and reduce DB calls to a minimum. The database will, in almost every case, be the bottleneck of your service, and more so the higher your site's usage gets. So whatever you do, avoid firing a query that isn't really necessary.
I try not to use more than 10 DB calls per page, but that really depends on your infrastructure and the information you want to provide.
Don't forget
Tony