Wednesday, January 28, 2009

Sims

I've been doing a bit of thinking about the concept of using simulations for performance analysis introduced in class. The motivating example used was comparing two types of RPC: one blocking, the other non-blocking. In introducing the idea of simulation, it's obviously necessary to abstract some elements of the system so you don't have to do something ridiculous like worry about where the disk head is at any given point in time. The question then has to be: what do you abstract, and how, and what must you simulate, and to what degree? For example:
1) Maybe you do need the disk involved, but only at a high level: if the RPC call blocks and the process has pages swapped out to disk, then performance will depend in part on things like whether a disk cache is in place.
2) How much of the OS do you simulate? For network stuff, surely it's important how efficiently the network driver handles different-sized packets. This can change even with point releases to an OS, or if you swap out a network card. In that case, the simulation is only good for version X of this OS with Y hardware, and isn't really about just the RPC code anymore.
3) What about chaos? I don't mean Windows. Actually, I have a new level of respect for Windows now that I've had to develop on a Mac for a bit (yuck). I mean situations where the equations blow up due to inherent unpredictability (sensitive dependence on initial conditions). You can use simplifying assumptions, and control load, but then how realistic is your sim?
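To make the abstraction question concrete, here is a minimal sketch in Python of the kind of toy blocking-vs-non-blocking comparison I have in mind. Nothing here comes from the class or the textbook: the 10 ms mean latency, the 1000 calls, and the exponential service-time assumption are all invented for illustration, and everything below the abstraction line (disk, scheduler, network driver) is collapsed into a single random latency draw, which is exactly the sort of simplification the questions above are poking at.

import heapq
import random

# Toy model, all numbers invented for illustration. A single client issues
# NUM_CALLS remote calls; everything below the abstraction line (disk,
# scheduler, network driver) is collapsed into one random latency draw.
MEAN_RPC_LATENCY = 0.010   # assumed: 10 ms mean round trip
NUM_CALLS = 1000

def simulate(blocking, seed=1):
    rng = random.Random(seed)   # fixed seed: the "suitable control on inputs"
    clock = 0.0                 # simulated time, in seconds
    if blocking:
        # Each call must complete before the next one is issued.
        for _ in range(NUM_CALLS):
            clock += rng.expovariate(1.0 / MEAN_RPC_LATENCY)
        return clock
    # Non-blocking: issue everything at once, track completions in an event heap.
    completions = []
    for _ in range(NUM_CALLS):
        heapq.heappush(completions, clock + rng.expovariate(1.0 / MEAN_RPC_LATENCY))
    while completions:
        clock = heapq.heappop(completions)   # advance time event by event
    return clock

print("blocking:     %.3f simulated seconds" % simulate(blocking=True))
print("non-blocking: %.3f simulated seconds" % simulate(blocking=False))

Run as-is, the blocking version takes roughly NUM_CALLS times the mean latency of simulated time, while the non-blocking version finishes when the slowest single call does. That gap is entirely an artifact of the modeling choices, which is the point: change the abstraction (cap the number of outstanding calls, charge a per-packet driver cost) and the answer can change.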

I'm not sure yet whether I like the textbook for the class, but I did like this one quote: "The complexities of interactions among various system components and user programs often surpass our ability to readily identify what system and algorithmic variables are really significant, and in what functional form these variables are related to a chosen measure of performance" (16). The book advises that in order to figure out what is significant, you either need to see the system in production, or do a really detailed sim.

When I started this class, I thought "simulations are simulations." I expected that performance simulations would be similar to the software simulations I'd used on the job to verify interfaces and programming logic. I now see that a performance analysis sim is potentially much more complex.

There's also something interesting that sims say about the nature of time. In a sim, time is quantized and turned into a sequence of steps that are, given suitable control on inputs, deterministic. A sim can be fast-forwarded or run at double speed, or checkpointed and restored later (a toy sketch of this follows below). You might go to work in the morning and have the sim run through a year's worth of processing by 5:00. I find something poetic in this sense of creating time in order to control it. There's a certainty and absoluteness that reminds me of what I like about Math: I prove a theorem and it's proved forever, and the proof has a life of its own, independent of me -- a proof is the creation of something infinite, something that transcends time. This is probably not what T.S. Eliot meant when he wrote "Only through time, time is conquered," but there's a resonance in there somewhere, at least for me.
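Here is the toy sketch I mentioned. The SimClock class and the numbers are made up here, not from the course: the point is only that simulated time is just a number the program owns, so fast-forwarding is a single assignment and "rewinding" is restoring a saved copy of the state.

import pickle

class SimClock:
    # Simulated time is just a number the program owns; it has no tie to wall-clock time.
    def __init__(self):
        self.now = 0.0

    def advance_to(self, t):
        assert t >= self.now, "simulated time never runs backwards"
        self.now = t

    def checkpoint(self):
        return pickle.dumps(self)    # freeze the simulated "present"

    @staticmethod
    def restore(blob):
        return pickle.loads(blob)    # pick up again from a saved present

clock = SimClock()
clock.advance_to(3600.0)                 # fast-forward an hour in one step
snapshot = clock.checkpoint()
clock.advance_to(365 * 24 * 3600.0)      # a year's worth of "processing" by 5:00
clock = SimClock.restore(snapshot)       # rewind to the saved hour
print(clock.now)                         # 3600.0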
