The von Neumann architecture - when you're cooking, there is a central processing unit, which is the top of the stove; there is mass storage, which is the fridge and the cupboard; there is a user interface, which is your attention span; there is RAM, which is the work space; there is an output device, the table; and there is also a network interface - the cook's relationship with those around him or her. At any given time, any one of these elements can be the system's rate-limiting factor - but it is a timeless, placeless truth that there is always one that is the system's bottleneck.
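Purely by way of illustration - the components and the numbers below are invented for the analogy, not measured in any real kitchen - a toy sketch of finding tonight's bottleneck might look like this:

# A toy model of the kitchen as a von Neumann machine: whichever component
# can handle the fewest jobs per hour is, at that moment, the bottleneck.
# All names and figures here are invented for the sake of the analogy.
kitchen_rates = {
    "CPU (stove)": 4,             # dishes per hour it can process
    "mass storage (fridge)": 12,  # ingredients per hour it can serve up
    "RAM (work space)": 6,        # jobs per hour it can stage and clear
    "output device (table)": 8,   # plates per hour it can seat
    "network (other people)": 3,  # useful interactions per hour
}
bottleneck = min(kitchen_rates, key=kitchen_rates.get)
print(f"Tonight's rate-limiting factor: {bottleneck}")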
More RAM is always welcome - Whether the bottleneck is fridge I/O, stovetop processing cycles, the interface with the cook, the queue of jobs waiting to be written to the table, or congestion in the social network, it's always the free space in RAM that acts as a buffer for the whole system. If you've got enough RAM, you can cope with most problems without anything dire happening, either by queueing things up or by pre-fetching them from the cupboard ahead of time.
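As a sketch of what that buffering looks like - the capacity and the jobs are invented, and a real kitchen is rather less tidy:

from collections import deque

# Free work space buffers the whole system: jobs wait on it until the stove
# is free, and ingredients can be pre-fetched onto it from the cupboard
# before they are needed. The capacity here is entirely invented.
WORK_SPACE_CAPACITY = 4
work_space = deque()

def put_on_work_space(item):
    """Queue a job or pre-fetch an ingredient, if there is free RAM."""
    if len(work_space) < WORK_SPACE_CAPACITY:
        work_space.append(item)
        return True
    return False  # no free RAM left: instability, and possibly dinner loss

put_on_work_space("chopped onions (queued for the stove)")
put_on_work_space("butter (pre-fetched from the fridge)")
print(list(work_space))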
But if free space drops below a certain threshold, the system tends to become increasingly unstable, and you risk a crash and possibly dinner loss.
Throwing hardware at the problem works...until it doesn't - You can only go so far in clearing space around the kitchen - if your demand for space goes too high, you need a bigger kitchen. Therefore, we need to pay close attention to scaling.
Amdahl's Law and the trade-offs of parallelisation - Doing things in parallel allows us to achieve extremely high performance, but it does so at the expense of simplicity. You can see this most clearly in classic British cooking - many different high-grade ingredients all require different forms of cooking and cook at different rates, but must all arrive at the same time on the plate. Of course, as Amdahl's Law states, when you parallelise a process, it's the elements you can't parallelise that become the limiting factor. You can prepare the pie's filling and its pastry in parallel, but the assembled pie still has to bake as a single, stubbornly serial job.
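Amdahl's Law puts a number on it: if a fraction p of the work can be shared among n cooks, the best possible speed-up is 1 / ((1 - p) + p/n). A back-of-the-envelope sketch, with entirely invented figures:

def amdahl_speedup(parallel_fraction, n_cooks):
    """Best possible speed-up when only part of the job parallelises."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cooks)

# Suppose 80% of the dinner can be shared out among the cooks, but the
# remaining 20% (the gravy, say) has to be done serially by one person.
for cooks in (1, 2, 4, 8, 1000):
    print(f"{cooks:>4} cooks -> {amdahl_speedup(0.8, cooks):.2f}x faster")
# However many cooks you add, you never do better than 1 / 0.2 = 5x.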
Distributed processing is great...until it isn't - Similarly, distributing tasks among independent nodes allows us to scale up easily and to achieve greater reliability. However, scale, reliability and consistency are often in conflict: the more cooks you have in the kitchen, the harder it is to maintain consistency between them, and the more critical it is that you get the networking element of the problem right. Strange emergent properties of the system may surprise you, and it seems to be a law that the consumption of drink scales as O(log n) with the number of cooks.
Test-driven development - Only fools rely on a priori design to guarantee the quality of their sauce. It's absolutely necessary to build in tests at every step of the cooking process, both to maintain quality and to stay agile in the face of unforeseen problems and user demands.
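In that spirit, a minimal sketch of what a built-in test might look like - the thicken and taste functions here are hypothetical stand-ins, not anyone's real recipe:

def thicken(sauce, minutes):
    """Stand-in for actually reducing the sauce on the stove."""
    return {**sauce, "thickness": sauce["thickness"] + 0.1 * minutes}

def taste(sauce):
    """Stand-in for the cook's own quality check."""
    return 0.5 <= sauce["thickness"] <= 1.0 and sauce["seasoned"]

def test_sauce_after_reduction():
    sauce = {"thickness": 0.2, "seasoned": True}
    sauce = thicken(sauce, minutes=5)
    assert taste(sauce), "back to the stove - don't plate it up"

test_sauce_after_reduction()  # run it before dinner, not after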
The only way to avoid coding bugs is to avoid coding - Ever since the days of Escoffier, cooks have known the importance of using well-known and well-tried recipes as modular building blocks. Escoffier started off with a handful of basic mother sauces on which he built the entire enterprise of French haute cuisine. So use the standard libraries, and don't try to invent a new way of making white sauce - just type
from sauces import roux
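Purely as a sketch of the same idea - the sauces module being, sadly, imaginary - a self-contained stand-in might look like this:

# Illustrative stand-in for the imaginary sauces library above, to make
# the point: build on the well-tried module, don't reinvent white sauce.
def roux(butter_g=25, flour_g=25):
    """The well-tried building block: equal parts butter and flour."""
    return {"butter_g": butter_g, "flour_g": flour_g}

def bechamel(milk_ml=500):
    """White sauce composed from the standard roux, not invented from scratch."""
    return {"base": roux(), "milk_ml": milk_ml, "simmer_minutes": 5}

print(bechamel())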
Love the Unix core utilities - Look around your kitchen. What utensils do you actually pick up and use every time you cook? Obviously, you need to invest in the things you use all the time, rather than expensive shiny gadgets you don't fully understand. And you need to master the technique of using them. Get a big sharp knife.
Shea's Law - Shea's Law states that "The ability to improve a design occurs primarily at the interfaces. This is also the prime location for screwing it up." This is just as true of cooking. If you can't get back from the fridge around the inevitable spectators in time to complete a frying loop, or two flavours fail to come together, or you catastrophically fall out with someone in the kitchen, bad things happen.
Loop constructs are fundamental to everything - Perhaps the most important decision you will make is whether to minimise how long each step of the process takes, or to minimise the number of steps in the process. But the number of different operations on the CPU - the stove - is the main driver of complexity.
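To make the trade-off concrete - the timings below are invented - total time at the stove is roughly the number of steps multiplied by the time each one takes, so you can attack either factor:

# Invented timings: total stove time is steps x minutes-per-step, so you
# can either shave time off each step or cut the number of steps.
def total_stove_time(steps, minutes_per_step):
    return steps * minutes_per_step

print(total_stove_time(steps=10, minutes_per_step=6))  # fiddly recipe: 60 minutes
print(total_stove_time(steps=10, minutes_per_step=3))  # faster steps:  30 minutes
print(total_stove_time(steps=5, minutes_per_step=6))   # fewer steps:   30 minutes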
Everyone underestimates the problems of deployment - How will your recipe work in another kitchen, or in the same kitchen under different circumstances?
The hacker ethos - If you have to know what line 348 will be before you open a text editor, you'll never get started. Similarly, you will get nowhere by wondering exactly what temperature, in degrees, you should saute onions at. Chuck your code in the oven, and see if it returns a roast chicken! Also, the fun of having a secret recipe is actually the fun of sharing it with others.
Junk food is bad for you, but sometimes it is unavoidable - Software produced by huge sinister corporations and stuffed with secret additives is likely to make you fat and stupid. But sometimes you need a pizza, or a really fancy presentation graphic, or a pretty music library application for a mobile device. Everyone does it - the thing is to maintain a balanced diet.