Adding to my reading list

This is the sort of unanticipated problem that reading blogs creates. I have too much interesting reading in the queue as it is. Brad may have gotten some free time, but he’s going to cost me some of mine.

Notes: More Free Time Tomorrow.

Raj Arunachalam has cancelled his appointment with me tomorrow: he’s flying to George Mason to interview James Buchanan and Gordon Tullock.

I have owned seven copies of Buchanan and Tullock’s (1962) The Calculus of Consent: Logical Foundations of Constitutional Democracy in my life. I keep loaning my copy out to graduate students: “You haven’t read this? You must read this!” They keep liking it so much they don’t return it. So I go and buy another one.

[Brad DeLong’s Semi-Daily Journal (2004)]

After Action Review Toolkit

This is a nice process for running AARs, together with a case study of AARs in action. AARs are a simple and powerful technique for discovering and communicating lessons learned. They work especially well in project-based environments. If they aren’t already in your bag of tricks, they should be.

After Action Review Toolkit. Allison Hewlitt has posted a draft After Action Review Toolkit, which provides a practical step-by-step process for running these reviews. To quote: The AAR is a simple process used by a team to capture the lessons learned from past successes… [Column Two]

Richard Morgan's Altered Carbon – 50 Book Challenge

Altered Carbon
Morgan, Richard

This was getting lots of buzz in different places that I trust. I picked it up and browsed it in the book store and put it back several times before I finally decided I was going to read it. Glad I did.

What makes good science fiction work, and the reason I continue to make it such a major component of my fiction reading, is making plausible hypotheses about a technological innovation and then relentlessly pursuing that “what if” wherever your understanding of behavior and society leads. Morgan does exactly that. If technology could give you a real-time backup brain that essentially lets you cheat death (but not pain), where does that take you? To some frequently nasty but compelling places.

John Brunner's Shockwave Rider – 50 Book Challenge

Shockwave Rider
Brunner, John

Long before William Gibson launched the genre of “cyber-punk”, Brunner was writing about the impact of information technology and accelerating change on society. This is Brunner’s effort to understand what Toffler’s Future Shock might feel like in human terms. To me, it’s one of the more effective examples of why someone once described science-fiction writers as the “advance planning department for the human race.” And it’s a hell of a good story, besides.

I re-read this story every couple of years and still find it compelling. I marvel at Brunner’s ability to extrapolate how the collision of technology and human nature is likely to play out. This time I picked it up again because of a recommendation from Evil Genius Chronicles to check out some music written as a backdrop to the story, which was also worth the download.

Complexity and design

If you think that technological systems are complex, imagine what that implies for the combination of technological and social systems. Socio-technical systems have been a rich vein mined in the organizational design and development world for decades. In general, though, that literature has been ignorant of the world of systems design (and vice versa, of course). These are some of my favorite quotes on the topic, and it’s so nice to see that someone else has done the work of assembling them for me :).

On Complexity.

There are two ways of constructing a software design: One way is to make it so simple there are obviously no deficiencies and the other way is to make it so complicated that there are no obvious deficiencies.
— C.A.R. Hoare

These new problems, and the future of the world depends on many of them, require science to make a third great advance, an advance that must be even greater than the nineteenth-century conquest of problems of simplicity or the twentieth-century victory over problems of disorganized complexity. Science must, over the next 50 years, learn to deal with these problems of organized complexity. […] Impressive as the progress has been, science has by no means worked itself out of a job. It is soberly true that science has, to date, succeeded in solving a bewildering number of relatively easy problems, whereas the hard problems, and the ones which perhaps promise most for man’s future, lie ahead. We must, therefore, stop thinking of science in terms of its spectacular successes in solving problems of simplicity.
— Warren Weaver

In our time, the technology of machines has drawn its inspiration from mechanics, dealing with complexity by reducing the number of relevant parts. The technology of government, on the other hand, has drawn upon statistical mechanics, creating simplicity by dealing only with people in the structureless mass, as interchangeable units and taking averages. […] For systems between the small and large number extremes, there is an essential failure of the two classical methods. On the one hand, the Square Law of Computation says that we cannot solve medium number systems by analysis, while on the other hand, the Square Root of N Law warns us not to expect too much from averages. By combining these two laws, then, we get a third – the Law of Medium Numbers:

For medium number systems, we can expect that large fluctuations, irregularities, and discrepancy with any theory will occur more or less regularly.

— Gerald M. Weinberg

[Incipient(thoughts)]
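Weinberg’s two laws are easy to make concrete with a little arithmetic. Here is a rough back-of-the-envelope sketch in Python; the pairwise-relationship count, the 1/sqrt(n) fluctuation estimate, and the example sizes are my own simplifications for illustration, not anything taken from Weinberg’s text.

import math

def pairwise_relationships(n):
    # Square Law of Computation, roughly: analytic effort grows like the
    # number of pairwise relationships among n parts, on the order of n^2.
    return n * (n - 1) // 2

def relative_fluctuation(n):
    # Square Root of N Law, roughly: the relative size of random fluctuations
    # around an average over n units shrinks like 1 / sqrt(n).
    return 1.0 / math.sqrt(n)

for n in (3, 300, 30_000_000):
    print(f"n = {n:>12,}: relationships ~ {pairwise_relationships(n):,}, "
          f"relative fluctuation ~ {relative_fluctuation(n):.4f}")

# Small n: few relationships, so full analysis (mechanics) is feasible.
# Very large n: fluctuations are tiny, so averages (statistics) are reliable.
# Medium n: too many relationships to analyze and fluctuations too large to
# average away -- Weinberg's medium-number squeeze.

The particular numbers don’t matter; what matters is that neither classical strategy gets much traction in the middle, which is exactly where organizations live.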

Learning caution in designing technology for organizations

There seems to be a whole series of great entries showing up in my aggregator around the theme of how technology interacts with the world at large (that’s the point of using RSS aggregators, isn’t it?).

The more time I spend trying to mesh technology with organizations, the more cautious I become. I still believe that carefully designed and deployed technology is essential for organizations and societies that hope to survive. But that design has to factor in how human systems shape designed systems over time. One of my own design goals is to channel and shape that evolution so that unintended consequences make up a smaller share of the outcomes, and so that the unintended consequences that do occur are more likely to be desirable than undesirable. One important aspect of that is to be very clear in pointing out things I believe to be technologically impossible. Technology cannot be the right answer to every question.

Good and Bad Technologies.

Fred writes about Clay Shirky’s comments on good and bad technologies and freedom to innovate:

The thing that will change the future in the future is the same thing that changed the future in the past — freedom, in both its grand and narrow senses.

The narrow sense of freedom, in tech terms, is a freedom to tinker, to prod and poke and break and fix things. Good technologies — the PC, the internet, HTML — enable this. Bad technologies — cellphones, set-top boxes — forbid it, in hardware or contract. A lot of the fights in the next 5 years are going to be between people who want this kind of freedom in their technologies vs. business people who think freedom is a shitty business model compared with control.

And none of this would matter, really, except that in a technologically mediated age, our grand freedoms — freedom of speech, of association, of the press — are based on the narrow ones. Wave after wave of world-changing technology like email and the Web and instant messaging and Napster and Kazaa have been made possible because of the technological freedoms we enjoy, especially the ones instantiated in the internet.

The internet means you don’t have to convince anyone that something is a good idea before trying it, and that in turn means that you don’t need to be a huge company to change the world. Microsoft gears up the global publicity machine for its launch of Windows 98, and at the same time a 19-year-old kid procrastinating on his CS homework invents a way to trade MP3 files. Guess which software spread faster, and changed people’s lives more?

Simple, and so true!

[E M E R G I C . o r g]

A very accurate description of where we are going. Groups that fail to recognize this will not succeed. [A Man with a Ph.D. – Richard Gayle’s Weblog]

The Dark Side of Numbers: The Role of Population Data Systems in Human Rights Abuses

Here’s an excellent and chilling example of the importance of thinking carefully about complex opportunities and problems. Technical rationality is too narrow a perspective to adopt in an interconnected world. At the same time, we can’t simply cede control to power elites that believe that either nothing or everything is possible with technology.

The Dark Side of Numbers: The Role of Population Data Systems in Human Rights Abuses.

Many people have heard me tell an anecdote that i learned while living in Holland: At the turn of the century, the Dutch government collected mass amounts of data about its citizens with good intentions. In order to give people proper burials, they included religion. In 1939, the Nazis invaded and captured that data in less than 3 days. A larger percentage of Dutch Jews died than any other Jews because of this system.

Well, i’d been searching for a citation for a while. Tonight, i remembered to ask Google Answers and in less than an hour, had a perfect citation:

William Seltzer and Margo Anderson, “The Dark Side of Numbers: The Role of Population Data Systems in Human Rights Abuses,” Social Research, Summer 2001.

The essay is even better than my anecdote and i truly believe that anyone in the business of doing data capture should be required to read this.

[apophenia]

National identity cards, wicked problems, and tough thinking

Let me also put in a plug for Bruce Schneier’s analysis of this and other security-related topics. These are critically important issues in their own right.

They are also examples of what Horst Rittel termed “wicked problems.” What I think is particularly important about wicked problems is that they require much more subtle and nuanced thinking. Schneier provides excellent examples of just that kind of thinking.

If this general topic interests you, another place you might want to look is the work of Jeff Conklin, who’s built some very interesting systems and process thinking on top of Rittel’s work. His work is available at CogNexus Institute. Be sure to take a look at “Wicked Problems and Social Complexity” (pdf file) and “Issues as Elements of Information Systems” (pdf file), which is Rittel’s original paper on the topic.

Identity cards considered harmful.

In the week after David Blunkett came out in favour of issuing a national ID card in the UK — and making it compulsory by 2010 — Bruce Schneier, who has forgotten more about security than Blunkett and his idiots ever knew in the first place, does a memorable take-down of the idea that ID cards contribute to security. It makes for sobering reading:

My objection to the national ID card, at least for the purposes of this essay, is much simpler.

It won’t work. It won’t make us more secure.

In fact, everything I’ve learned about security over the last 20 years tells me that once it is put in place, a national ID card program will actually make us less secure.

My argument may not be obvious, but it’s not hard to follow, either. It centers around the notion that security must be evaluated not based on how it works, but on how it fails.

It doesn’t really matter how well an ID card works when used by the hundreds of millions of honest people that would carry it. What matters is how the system might fail when used by someone intent on subverting that system: how it fails naturally, how it can be made to fail, and how failures might be exploited.

Read the rest if you want the gory details. Basically, it’s not good. And that’s before you factor in the stupendous price of the scheme (£70 per person? You gotta be kidding!) and the security apparat to administer it and the headaches when it goes wrong or is incorrectly trusted, never mind the civil liberties implications.

The authoritarian weakness is to assume a sweeping solution to a perceived problem will, in fact, solve it — rather than introducing new loopholes. And this looks to be a classic case of shoot-self-in-jackboot.

(Have I plugged Bruce’s book Beyond Fear: Thinking Sensibly About Security in an Uncertain World yet? If not, consider it plugged. Go. Read it. Open your eyes and see how we’re screwing up. It’s seminal.)


[Charlie’s Diary]