Page: Previous  1, 2

Status: Assistant
Joined: 04 Oct 2003
Posts: 594
To me, if the includes are requested repeatedly, or if many includes are used that could have been combined into one, each request is a disk access, which is radically slower than reading from RAM. I suspect that makes a difference. When I make pages, most of the templating-type stuff I use is loaded in a single templating component file via include_once; if I need something from it, I usually pack it into an array and hand the array contents to the requesting functions. So it's all in RAM, basically. That's my idea of it anyway, could be wrong.
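The load-once, serve-from-memory idea above can be sketched roughly like this (in Python rather than PHP for brevity; the module, function, and template names here are made up for illustration, not from the post):

```python
# templates.py -- imported exactly once per process, like PHP's include_once.
# Hypothetical module: names and template strings are illustrative only.

_TEMPLATES = None  # module-level cache, filled on first use


def load_templates():
    """Read every template off disk once and pack them into a dict."""
    global _TEMPLATES
    if _TEMPLATES is None:  # only the first call would touch the disk
        _TEMPLATES = {
            "header": "<div class='hdr'>{title}</div>",
            "footer": "<div class='ftr'>{year}</div>",
        }
    return _TEMPLATES


def render(name, **values):
    """Serve a template straight from the in-memory dict -- no disk access."""
    return load_templates()[name].format(**values)
```

Every call after the first is a dictionary lookup in RAM; only the initial load pays the disk cost.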

I'd also check out the server in the console, run:

:: Code ::
top
and just watch the processor and memory usage at peak load. If memory is maxing out, just increase it by a gigabyte, then check again. If processor usage is maxing out and memory usage isn't, you have a problem, I think. I'd do all of the below, then recheck processor and memory usage.

If the CSS is served off the main server, Apache doesn't have to deal with those requests at all; that's a very large drop in HTTP requests.

Also keep in mind that the standard IP packet size is 576 bytes (512 bytes of data, the rest headers). So when you optimize the HTML, CSS, JS, and any image files to drop the HTTP connection time per visitor, and the number of packets Apache has to keep track of per HTTP request, make sure you drop the overall file size to a number below the one you previously got from calculating the total actual bytes of each file.

For example, multiply kB by 1000 to get bytes: 12.345 kB * 1000 gives you 12,345 bytes. Then:
:: Code ::
number_of_ip_packets = page_size_bytes / 512

Round the number_of_ip_packets up to the next integer value. If a file is even 1 byte over it requires a new packet.
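The calculation described above is just a ceiling division; here it is as a quick sketch (the 512-byte payload figure is the one from the post, and the function name is made up):

```python
import math


def ip_packets(page_size_bytes, payload_bytes=512):
    """Packets needed for a page, rounding up: even 1 extra byte costs a packet."""
    return math.ceil(page_size_bytes / payload_bytes)


print(ip_packets(12345))  # 12345 / 512 = 24.11..., rounds up to 25 packets
print(ip_packets(12288))  # exactly 24 * 512, so 24 packets
print(ip_packets(12289))  # 1 byte over the boundary, so 25 packets
```

So trimming a 12,345-byte page down to 12,288 bytes saves a packet, while trimming it to 12,300 bytes saves nothing.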

If you haven't dropped the total below a 512-byte boundary, i.e. by at least one whole packet, nothing of any significance has been saved. Merging the CSS files makes that easier, and pulling the CSS (and any images you might be using on the site, and any JS you might have) off the main server will just take that much more work off Apache.

I'd be looking very closely at processor usage and memory usage overall before trying to tweak Apache; if it's getting maxed out due to load, that load has to be dropped.

500k a day doesn't seem unreasonable though; it seems like the server should be able to handle it with some tweaks.
Status: Contributor
Joined: 20 Jul 2004
Posts: 122
Location: Central Illinois, typically glued to a computer screen
I'm certainly no expert on server management - in fact I'm not even up to newbie standards - but one thing I've heard is that mod_gzip can take a lot of CPU resources. You might want to set things up so your CSS and JS files aren't compressed on the fly on every request. You could then use one method or another to serve statically-compressed versions to the browsers that can take them and uncompressed versions to those that can't. That would free up some processing time.
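One common way to do the static-precompression part is with mod_rewrite; this is only a sketch under the assumption that mod_rewrite, mod_headers, and mod_mime are loaded, not a drop-in config, and the details vary by Apache version:

```apache
# Precompress once at deploy time, e.g.:  gzip -9 -c site.css > site.css.gz
RewriteEngine On

# If the browser accepts gzip and a precompressed copy exists, serve it
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*\.(css|js))$ $1.gz [L]

# Make sure the .gz copies go out with the right type and encoding headers
<FilesMatch "\.css\.gz$">
    ForceType text/css
    Header set Content-Encoding gzip
</FilesMatch>
<FilesMatch "\.js\.gz$">
    ForceType application/javascript
    Header set Content-Encoding gzip
</FilesMatch>
```

Browsers that don't send gzip in Accept-Encoding simply fall through to the uncompressed file, so nothing is compressed per-request.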