Tuesday, August 11, 2009

Recipe

I don't usually publish recipes, but Kathryn really liked this. I was going for a soup that would have a creamy flavor without being too heavy. I cheated by basing this on a can :) Baxter's soup is made in Scotland, not too far from where I grew up, but I found cans in our local HEB. This recipe is heretical since vichyssoise is supposed to be served cold, but I don't really care so long as it tastes good.

Salmon and Mushroom Vichyssoise

1 lb of salmon, cut into 1" cubes (with or without skin -- your preference)
2 medium red or white potatoes cut into 1/2" cubes
6-8 oz of mushrooms (button, cremini or portabella), coarsely chopped
1 can of Baxter's Vichyssoise soup
14-16 oz of chicken broth (either a can or bouillon is fine; fat-free/low fat is fine)
1 yellow (crookneck) squash, sliced
1 carrot, sliced (optional)
1 small onion, chopped (optional)
1 tsp turmeric
1/2 tsp dill
1/2 tsp garlic powder
salt and pepper to taste


Place all ingredients in a 3 qt pot. Bring to a boil, then simmer, covered, for 40 mins.
Makes 5 large bowls.

Serving suggestions: I served this as an entrée with Indian flatbread (roti) because it's quick to make and uses ingredients I have on hand. You could also add rice to the soup, or serve with another type of bread.

Sunday, August 2, 2009

Fun with beams

It's really nice to see the house as repairs progress. First, the scary bit -- seeing the facade torn off to reveal what's underneath, good or bad. Then the approach gets hashed out. Then the bad stuff, representing perhaps years of neglect, gets cut out and replaced with good, new stuff. And hopefully the replacement process doesn't create new problems of its own, like drywall cracks or plumbing leaks. It's so like coding that it makes me wonder when I (or some other poor person) will look at code I wrote, rip off the nice facade presented to users, and declare something rotten.
That's related to something I haven't quite decided about high- versus low-process systems. Set aside the extremes -- ridiculous process on one end (analysis paralysis, say, or process guardians like the gatekeeper of the law in Kafka) and pure hacking on the other -- and ask: where does the point of maximal utility lie, where you get excellent-quality code produced efficiently? That, I assume, is what everyone is looking for, and it is probably unanswerable outside a context: this project, this group of people, this set of time/budget constraints, this set of customers and stakeholders. So, then, does it come down to gut feeling -- "I'm not comfortable with the amount of test time," "I'm constantly bogged down in useless process activities"?
The process world is full of frameworks, but frameworks need interpreters, customizers, domain specialists. And even then, the results are not always good and need to be revised. The process specialists have thought of this; their mantra is "continual process improvement." But in practice it seems far too costly and disruptive to thoroughly revise processes, and it's embarrassing to admit you've been spending money on process that's wasted effort, so what you get is process ossification followed by patches to address new cases. And since the goal is to develop sets of processes that can be passed down to future projects, you get process megaliths carted by forklift from one project to another long after the reason for their existence has been forgotten.
Compared to this, hacking seems attractive. But it seems that even better would be to learn what is useful from the process folks, to take the time to understand why certain processes are good and work well. Occasionally, process saves the day; having experienced this myself, I can say it doesn't take too many "day savings" to become convinced that there is value in process. But the value is only there a small percentage of the time. And the thing is that the value is often precisely where the developer doesn't want it to be -- in the stubborn anal retentiveness of process, in some ridiculous process that's useless 99.9% of the time. The process that forces you to write down exactly what is on the screen. The process that forces you to put a second and third and fourth pair of eyes on something that obviously works (well, almost). The process that makes you spend hours documenting details that no one will ever look at again. 99.9% of the time you spend doing this is wasted. BUT, there's the 0.1%. And, actually, it's not about understanding that 0.1% process, or believing in it, or thinking of it as worthwhile; only doing it matters: ignoring your gut instinct that this is a waste of time and following the process like a slave, with as good an attitude as you can muster. Is 0.1% enough to justify this?
If 0.1% is not acceptable, then what number would be enough -- 1% useful? 10% useful? 50% useful? And how would you know -- process metrics never seem to measure these sorts of issues; maybe they can't be measured.
If 0.1% is acceptable, then where does the limit lie? Should we layer on enough process that we code only one line per week, because somewhere there's a 0.0000001% chance that the process will turn up something important? It seems reasonable that there is some level of process that is too much; process at any cost is not justifiable. So, then, there's a need for a metaprocess to decide which processes are worthwhile. But I don't know that I've seen a complete metaprocess of this kind, merely some broad brushstrokes in this direction (e.g. here), and even these are disputed (see here).
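One way to make that break-even question concrete is a back-of-the-envelope expected-value sketch. All the numbers below are invented for illustration (the hours, frequencies, and probabilities are assumptions, not measurements): a process step pays for itself when the chance of it catching something, times the cost of the disaster it prevents, exceeds what the step costs every time you run it.

    # Break-even sketch for a single process step (all numbers invented).
    cost_per_run = 2.0      # hours the step costs each time it's performed (assumption)
    runs_per_year = 500     # how often the step runs in a year (assumption)
    disaster_cost = 5000.0  # hours a caught problem would otherwise have cost (assumption)

    # Per run, the step pays off when p_catch * disaster_cost > cost_per_run.
    break_even_p = cost_per_run / disaster_cost
    print(f"Break-even catch rate: {break_even_p:.2%} per run")  # 0.04%

    # With these numbers a 0.1% catch rate clears the bar; 0.0000001% does not.
    for p_catch in (0.001, 1e-9):
        expected_saving = p_catch * disaster_cost * runs_per_year
        total_cost = cost_per_run * runs_per_year
        print(f"p={p_catch:.9f}: saves {expected_saving:,.1f} h/yr vs costs {total_cost:,.0f} h/yr")

Of course, the hard part is exactly what I was complaining about above: nobody hands you p_catch or disaster_cost, and the process metrics I've seen don't measure them either.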