Via Bruce Schneier's blog, an interesting paper in PNAS on false positives and screening for terrorists. Even if the assumptions behind profiling are valid, and the targeted group really is more likely to contain terrorists, it still isn't good policy. Because the between-group difference in the proportion of terrorists is small relative to the absolute scarcity of terrorists in the population, profiling means you hugely over-sample the people who match the profile. It magnifies the hit rate, but it magnifies the false-positive rate too, and because a search carried out on someone matching the profile is a search not carried out elsewhere, it increases the chance of missing someone who doesn't match.
In fact, if you profile, you need to balance this by searching non-profiled people more often.
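The base-rate arithmetic is worth seeing with numbers. Here's a back-of-the-envelope sketch; all the figures (population size, number of attackers, the 10x risk multiplier for the profiled group) are invented for illustration, not taken from the paper:

```python
# Illustrative base-rate arithmetic for profiling. All numbers are
# invented assumptions, chosen only to show the orders of magnitude.
population = 100_000_000      # travellers screened per year
terrorists = 10               # actual attackers among them
profiled_share = 0.05         # fraction of travellers matching the profile
risk_multiplier = 10          # profiled group assumed 10x more likely

# Split the attackers between the two groups according to the assumed
# relative risk: each profiled person carries 10x the weight of a
# non-profiled person.
profiled_weight = profiled_share * risk_multiplier
terrorists_profiled = terrorists * profiled_weight / (
    profiled_weight + (1 - profiled_share)
)

# Prevalence among the people you'd actually be searching.
profiled_pop = population * profiled_share
prevalence = terrorists_profiled / profiled_pop
print(f"attackers in profiled group: {terrorists_profiled:.2f} of {terrorists}")
print(f"attackers per million profiled searches: {prevalence * 1e6:.2f}")
```

With these (made-up) numbers, even a profile that genuinely makes its targets ten times riskier yields well under one attacker per million searches of the profiled group, so essentially every positive is a false positive, while roughly two-thirds of the actual attackers sit outside the profile entirely.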
The operators of Deepwater Horizon disabled a lot of alarms in order to stop false alarms waking everyone up at all hours. Shock! In some ways, though, that was better than this story from comp.risks about a US hospital, where a patient died after an alarm was missed. Why was it missed? Too many alarms, beeps, and general noise; staff had turned off some devices' alarms just to get rid of them.
Unlike Transocean, the hospital had a solution: remove the off switches, because that way staff will damn well have to listen. At least the oil people didn't think that would work. Of course, it occurred to neither of them that if your warning system goes off so often that nobody can sleep when nothing unusual is going on, there's something wrong with the system.