A reader’s guide to Clay Christensen and disruptive innovation

[cross posted at FASTforward blog]

A dozen years ago, at the height of the dotcom boom, Harvard Business School professor Clay Christensen published The Innovator’s Dilemma. It started from a simple observation: transformative innovations that reshaped competitive landscapes and created new industries almost invariably came from new organizations. Conventional wisdom held that this was a reflection of poor management and decision making on the part of incumbents. Christensen started with a more interesting, and ultimately more productive, question: what if it was sound management practice on the part of incumbents that prevented them from investing in those innovations that went on to create new industries? This question and Christensen’s research led him to distinguish disruptive from sustaining forms of innovation. I originally reviewed the book in the Spring 1998 issue of Context Magazine. It became the bible of consulting firms working in the dotcom space. Every proposed idea was labeled as disruptive. Who knows, some of those consultants might even have read the book.

Meanwhile, Christensen and his colleagues and collaborators continued to work out the ideas and implications of his emerging theoretical framework. The Innovator’s Dilemma was followed by

  • The Innovator’s Solution: Creating and Sustaining Successful Growth.

    In this book, Christensen begins to lay out how you can take the notions of disruptive innovation and use them to design a reasonable course of action in the absence of the kind of analytical data strategy consultants desire. Disruptive innovations attack either the lower ends of existing markets, where there are customers willing to settle for less performance at less cost, or new markets, where a new packaging and design of available technologies creates an alternative to non-consumption. The example I found easiest to understand here was Sony’s invention of the portable transistor radio. Compared to vacuum tube radios, the first transistor radios were crappy, but good enough for teenagers and others on the go whose alternative was no music at all.

  • Seeing What’s Next: Using Theories of Innovation to Predict Industry Change.

    In this third effort to work out the implications of distinguishing between sustaining and disruptive innovation, Christensen and his collaborators shift their attention from individual competitors to industry level analysis. They take their theoretical structures and apply them across several industry settings and ask how those particular industries (education, aviation, health care, semiconductors, and telecommunications) are more or less vulnerable to disruptive innovation strategies. What Christensen and colleagues are doing here is to begin integrating their innovation theories and Porter’s theories of competitive strategy. This is not so much a case of seeing whether their new theoretical hammer can pound strategy nails as it is of whether they are making progress in creating a new and robust toolkit for strategy problems.

  • The Innovator’s Guide to Growth: Putting Disruptive Innovation to Work, Anthony, Scott D.

    This volume is written by Scott Anthony and several other collaborators of Christensen who are putting his ideas to work at the consulting firm Innosight. They develop the next level of operational detail needed to transform strategic insights into execution details. If your organization is seeking to develop its own disruptive strategy, the authors here have worked out the next-level questions and identified the supporting analyses and design steps you would need to answer and complete. This volume is not a teaser; it’s complete and coherent. You could pretty much take the book as a recipe and use it to develop your project plans. On the other hand, the plans by themselves won’t guarantee that you can assemble a team with the necessary qualifications to execute them successfully. The other thing that this book does quite nicely is identify the kinds of organizational support structures and processes that you would want to put in place to institutionalize systematic disruptive innovation.

This core of books would equip you with a robust set of insights and practical techniques for thinking about when and where you might attempt to develop and deploy new products, services, and business models in disruptively innovative ways. The one area that is underdeveloped in this framework is design. There is an implicit bias in the material that tends to keep design in the "perform magic" category. I believe this is part and parcel of the execution bias of business literature in general. Design is flaky, creative stuff, and real managers distinguish themselves on execution. But that is a topic for another post. These books belong on your shelf and the ideas belong in your toolkit.

Gary Hamel and innovations in management

The Future of Management, Hamel, Gary

 

Gary Hamel has been an astute observer of organizations and management for several decades now. For all the reasons that seemed to make sense at the time, this book sat on my shelf for a while before I got to it. Based on the current state of the economy, I suspect a number of executives who could have benefitted from Hamel’s insights also failed to get them in a timely fashion. Hamel’s central thesis is that management is a mature technology and is ripe for disruptive innovation. Although he makes only passing reference to Clay Christensen’s work, there are important points of linkage between these two management thinkers.

The underlying rationale behind management philosophy and practices was largely laid down in the early decades of the twentieth century during the growth and ascendancy of the large multi-divisional industrial organization. In other words, most managers continue to operate with the mindset and practices originally developed to handle the problems encountered by the railroads, GM, IBM, and the other organizations making up the Dow Jones average between 1930 and 1960. While we’ve experienced multiple innovations in products, technologies, services, and strategies, the basics of management have changed little. Here’s how Hamel puts it:

While a suddenly resurrected 1960s-era CEO would undoubtedly be amazed by the flexibility of today’s real-time supply chains, and the ability to provide 24/7 customer service, he or she would find a great many of today’s management rituals little changed from those that governed corporate life a generation or two ago. Hierarchies may have gotten flatter, but they haven’t disappeared. Frontline employees may be smarter and better trained, but they’re still expected to line up obediently behind executive decisions. Lower-level managers are still appointed by more senior managers. Strategy still gets set at the top. And the big calls are still made by people with big titles and even bigger salaries. There may be fewer middle managers on the payroll, but those that remain are doing what managers have always done–setting budgets, assigning tasks, reviewing performance, and cajoling their subordinates to do better. (p. 4)

Hamel sets out to explore what innovation in the practice of management would look like and how organizations and managers might tackle the problems of developing and deploying those innovations. I don’t think he gets all the way there, but the effort is worth following.

The first section of the book lays out the case for management innovation as compared to other forms. The second examines three organizations that Hamel considers worthy exemplars: Whole Foods, W.L. Gore, and Google. The last two sections build a framework for how you might start doing managerial innovation within your own organization.

Hamel does a good job of extracting useful insights from the case examples he presents. Hamel’s own preference is for a managerial future that is less hierarchical and less mechanical. At the same time, he wants each of us to commit to doing managerial innovation for ourselves. This leaves him in a bit of a bind. I suspect that Hamel would like to be more prescriptive, but his position forces him to leave the prescription as an exercise for the reader. While I agree with Hamel that both individuals and organizations need to be formulating their own theories of management and experimenting on their own, this is not likely to happen in most organizations, particularly in the current economic climate. Necessity is not the mother of invention; rather, it forces us to cling to the safe and familiar. We need a degree of safety and a degree of slack to do the kinds of thinking and experimenting that will produce meaningful managerial innovations. I fear that may be hard to come by in the current environment, no matter how relevant or necessary.

What you can do in the interim is research and reflection to discover or define opportunities for possible managerial innovations. This book is one excellent starting point, but insufficient on its own.

Is this an agenda worth pursuing? What else would you recommend to move forward?

Tools for tackling wicked problems: Review of Jeff Conklin’s “Dialogue Mapping”

Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them.

– Laurence Peter

 

[Cross posted at FASTforward]

Dialogue Mapping: Building Shared Understanding of Wicked Problems, Conklin, Jeff

However you’re paying attention to the current external environment — the nightly news, newspapers, blogs, Twitter, or the Daily Show — it’s a grim time. While there is a great deal of noise, there’s not as much light as you might like. Dialogue Mapping, by Jeff Conklin, is one effort to equip us with tools for creating more light. Conklin started out doing research on software for group decision support, but that research led him into some unexpected places: organizational dynamics and problem structure. He starts with the notion of “fragmentation” as the barrier to coherent organizational action. He defines fragmentation as “wicked problems x social complexity.”

I’m often surprised that the term “wicked problem” hasn’t become more common. The notion and the term have actually been around for decades. Horst Rittel at Berkeley coined the term in a paper, “Issues as Elements of Information Systems,” in the 1970s. Rittel identified six criteria that distinguish a particular problem as a wicked one:

  1. You don’t understand the problem until you have developed a solution
  2. Wicked problems have no stopping rule
  3. Solutions to wicked problems are not right or wrong
  4. Every wicked problem is essentially unique and novel
  5. Every solution to a wicked problem is a one-shot operation
  6. Wicked problems have no given alternative solutions

Compare wicked problems with tame problems. A tame problem:

  1. Has a well-defined and stable problem statement
  2. Has a definite stopping point
  3. Has a solution that can be objectively evaluated as right or wrong
  4. Belongs to a class of similar problems that are all solved in the same way
  5. Has solutions that can be easily tried and abandoned
  6. Comes with a limited set of alternative solutions

Obviously there are degrees of wickedness/tameness. Nevertheless, the real world of politics, urban planning, health care, business, and a host of other domains is filled with wicked problems, whether we acknowledge them as such or not. All too often, wicked problems go unrecognized as such. If you do recognize a problem as a wicked one, you can choose to attempt to tame it to the point where you might be able to solve it. Some ways to tame a wicked problem include:

  • Lock down the problem definition
  • Assert that the problem is solved
  • Specify objective parameters by which to measure the solution’s success
  • Cast the problem as just like a previous problem that has been solved
  • Give up on trying to get a good solution to the problem
  • Declare that there are just a few possible solutions, and focus on selecting from among these options

These are the kinds of problem management strategies frequently seen in organizations. Conklin makes a good case that we and our organizations would be better off if we were more explicit and mindful that this is what we are up to. That isn’t always possible, which brings us to Conklin’s second element driving fragmentation: social complexity. Independently of the features that make them wicked, problems also exist in environments of multiple stakeholders with differing worldviews and agendas.

This social complexity increases the challenge of discovering or inventing sufficient shared ground around a problem to make progress toward a resolution or solution. This is where Conklin’s book adds its greatest value by introducing and detailing “Dialogue Mapping,” which is a facilitation technique for capturing and displaying discussions of wicked problems in a useful way.

Assume that someone recognizes that we have a wicked problem at hand and persuades the relevant stakeholders to gather to discuss it and develop an approach for moving forward. Assume further that the stakeholders acknowledge that they will need to collaborate in order to develop that approach (I realize that these are actually fairly big assumptions). More often than not, even with all the best of intentions, the meetings will produce lots of frustration and little satisfying progress. Our default practices for managing discussions in meetings can’t accommodate wicked problems, which is one of the reasons we find meetings so frustrating.

“Dialogue Mapping” takes a notation for representing wicked problems, IBIS (short for Issue-Based Information System) and adds facilitation practices suited to the discussions that occur with wicked problems. The IBIS notation was developed by Rittel in his work with wicked problems in the 1970s. It is simple enough to be largely intuitive, yet rich enough to capture conversations about wicked problems in useful and productive ways.

The building blocks of a dialogue map are questions, ideas, arguments for an idea (pros), and arguments against an idea (cons). These simple building blocks, together with what is effectively a pattern language of typical conversational moves, constitute “dialogue mapping.” The following is a fragment of a dialogue map that might get captured on a whiteboard in a typical meeting:

[Image: example dialogue map fragment]
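The IBIS building blocks described above lend themselves to a simple tree representation. Here is a minimal sketch in Python; the class and field names are my own illustration, not Conklin's or any particular tool's schema:

```python
from dataclasses import dataclass, field

# Hypothetical minimal IBIS node types: a Question holds candidate
# Ideas, and each Idea accumulates supporting (pro) and opposing
# (con) Arguments.
@dataclass
class Argument:
    text: str
    supports: bool  # True for a "pro", False for a "con"

@dataclass
class Idea:
    text: str
    arguments: list = field(default_factory=list)

@dataclass
class Question:
    text: str
    ideas: list = field(default_factory=list)

# A tiny map: one question, one idea, one pro and one con.
q = Question("How should we reduce meeting overload?")
idea = Idea("Adopt dialogue mapping for recurring planning meetings")
idea.arguments.append(Argument("Captures each perspective visibly", supports=True))
idea.arguments.append(Argument("Requires a trained facilitator", supports=False))
q.ideas.append(idea)

# Collect just the pros for the idea.
pros = [a.text for a in idea.arguments if a.supports]
print(pros)  # ['Captures each perspective visibly']
```

Because the notation is just questions, ideas, pros, and cons, even this toy structure is enough to capture the conversational "moves" the book catalogs: a digression becomes a new Question node rather than a derailment.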

While the notation is simple enough, learning to use it on the fly clearly takes some practice. Some starting points for me are using it to process my conventional meeting notes and beginning to use the notation while taking notes on the fly. I’m not yet ready to employ it explicitly in meetings I am facilitating, especially given Conklin’s advice that the technique changes the role of meeting facilitator in some significant ways.

Conklin argues that, when applied successfully in meeting settings, dialogue mapping creates a shared representation of the discussion that accomplishes several important things:

  1. Allows each individual contributor to have their perspective accurately heard and captured
  2. Reduces repetitious contributions by having a dynamic, organized, and visible record of the discussion. Attempts to restate or remake points that have already been made can be short-circuited by reference to the map
  3. Digressions or attempts to question the premises of a discussion can be simply accommodated as new questions that may not, in fact, fit immediately in the current map tree. They can be addressed as they surface and located appropriately in the map. Or they may be seen as digressions to be addressed briefly and then the discussion can pick up in the main map with little or no loss of progress.

Much of the latter part of the book consists of showing how different conversational “moves” play out in a dialogue map. Assuming you are working with organizations that actually want to tackle wicked problems more productively, understanding these moves is immensely illuminating. Actually, it’s also illuminating if you’re in a setting where the incumbents aren’t terribly interested in the value of shared understanding. In those settings, you might need to keep your dialogue maps to yourself.

There are two software tools that I am aware of designed to support dialogue mapping. One is a tool called Compendium, which grew out of Conklin’s research. It is available as a free download and is built in Java, although it is not currently open source from a licensing point of view. The other is a commercial tool called bCisive, developed in Australia by Tim van Gelder and the folks at Austhink. Here’s what a dialogue map in Compendium would look like. This particular map is a meta-map of the dialogue mapping process from 50,000 feet.

[Image: meta-map of the dialogue mapping process in Compendium]

As I’ve spent time developing a deeper understanding of wicked problems and dialogue mapping, it’s become clear that we have more of the former to tackle and need the tools of the latter to wrestle with them. In this world, decisions don’t come from algorithms or analysis; they emerge from building shared understanding. In this world, to quote Conklin’s conclusion, “the best decision is the one that has the broadest and deepest commitment to making it work.” These are the tools we need to become facile with to design those decisions.

Business models for health care: Andy Kessler’s take on the future of medicine

The End of Medicine: How Silicon Valley (and Naked Mice) Will Reboot Your Doctor, Kessler, Andy

 

Andy Kessler is a former Wall Street investment analyst turned author. He learned his trade following Silicon Valley and its successful long-term obsession with Moore’s Law. In that world, as technology scales, costs fall predictably and new markets emerge. In The End of Medicine, Kessler turns to the world of health care and medicine to discover how and where that underlying investment model might apply. It’s an interesting premise and, despite some annoying stylistic quirks, Kessler delivers some real value. The book doesn’t get to anything remotely like an answer, but it collects and organizes a lot of useful information that might help us get closer to one.

Kessler opts for a highly anecdotal style, presumably to put a more human face on a large, complex subject. For me, he overshoots the mark and loses the big picture. The color commentary overwhelms the underlying story line, which was my primary interest. But there is a good story line that is worth finding and holding on to.

Medicine’s roots are in making the sick and injured better. Triage is baked into the system at all levels. Observe symptoms, diagnose problem, apply treatment, repeat. Over time we’ve increased our capacity to observe symptoms and have gotten more sophisticated in the treatments we can apply, but the underlying logic is based on pathology. Also over time, a collection of industries have evolved around this core logic and these industries have grown in particularly organic and unsystematic ways.

Kessler runs into these roots and this logic throughout his journey. However, coming from the semiconductor and computer industries, as he does, he doesn’t fully pick up on their relevance. As industries, computers and semiconductors are infants compared to medicine and health care. Not only do Kessler’s industries operate according to Moore’s Law, but they are structurally designed around it. His analysis of health care identifies a number of crucial pieces, but he stops short of assembling a picture of the puzzle.

Kessler focuses much of his attention on developments in imaging and diagnostics. Both areas have seen tremendous advances and hold out promises of continued technological development similar to what we’ve seen in semiconductors.

Imaging is a computationally intensive area that benefits fully as an application of computing technologies. What is far less clear is whether the current structure of the health care industry will be able to absorb advances in imaging technologies at the pace that will let Moore’s Law play out in full force.

There is a second problem with imaging technologies that applies equally to other diagnostic improvement efforts. As we get better and better at capturing detail, we run into the problem of correctly distinguishing normal from pathological. While we may know what a tumor looks like on a mammogram, what we really want to know is whether that fuzzy patch is an early warning sign of a future tumor or something we can safely ignore. The better we get at detecting and resolving the details of smaller and smaller fuzzy patches, the more we run into the problem of false positives: finding indicators of what might be a tumor that turn out on closer inspection to be false alarms. Our health care system is organized around pathologies; we fix things that are broken. Because of that, the data samples we work with are skewed; we have a much fuzzier picture of what normal looks like than what broken looks like.

This is the underlying conceptual problem that efforts to improve diagnostics and early detection have to tackle. Kessler devotes much of his later stories to this problem. He profiles the work of Don Listwin, successful Silicon Valley entrepreneur, and his Canary Fund efforts. Here’s the conundrum. If you detect cancers early, treatment is generally straightforward and highly successful. If you catch them later, treatment is difficult and success is problematic. Figuring out how to reliably detect cancer early has a huge potential payoff.

The kicker is the word “reliably” and the problem of false positives, especially as you begin screening larger and larger populations. If you have a test that is 99% accurate, then for every 100 people you screen you will get the answer wrong for one person. The test will either report a false positive – that you have cancer when, in fact, you don’t – or a false negative – that you are cancer-free when you aren’t. As you pursue early detection, the false positive problem becomes the bigger problem. Screen a million people and you will have 10,000 mistakes to deal with, the vast majority of which will be false positives. That represents a lot of worry and a lot of unnecessary expense to get to the right answer.
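The arithmetic behind the false-positive problem is worth making concrete. A quick sketch, assuming (as a hypothetical, since the text doesn't specify) a test that errs 1% of the time in each direction and a cancer prevalence of 0.5% in the screened population:

```python
# Rough illustration of the screening arithmetic: with low prevalence,
# nearly all of the ~10,000 errors from screening a million people
# fall on the healthy, i.e. they are false positives.
population = 1_000_000
error_rate = 0.01     # test wrong 1% of the time, either direction
prevalence = 0.005    # assumed: 0.5% of the population has cancer

sick = population * prevalence          # 5,000 people with cancer
healthy = population - sick             # 995,000 without

false_negatives = sick * error_rate     # sick people the test misses
false_positives = healthy * error_rate  # healthy people flagged anyway

print(int(false_positives))  # 9950
print(int(false_negatives))  # 50
```

Under these assumptions, roughly 9,950 of the 10,000 mistakes are healthy people told they might have cancer, which is exactly the worry and unnecessary expense Kessler describes.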

Kessler brings us to this point but doesn’t push through to a satisfactory analysis of the implications. Implicitly, he leaves it as an exercise for the reader. His suggestion is that this transition will present an opportunity for the scaling laws he is familiar with to operate. I think that shows an insufficient appreciation for the complexities of industry structure in health care. Nonetheless, Kessler’s book is worth your time in spite of its flaws.


A workbook on doing disruptive innovation effectively

[cross posted at FASTForward Blog]

The Innovator’s Guide to Growth: Putting Disruptive Innovation to Work, Anthony, Scott D. et.al.

The Innovator’s Guide to Growth is the newest installment in a series of books articulating and explicating Prof. Clay Christensen’s theory of disruptive innovation. This hands-on guide packages some of the insights developed as an outgrowth of the consulting work of Innosight, LLC, the consulting firm founded by Christensen to pursue the practical insights from his research at the Harvard Business School. If innovation is part of your current or prospective job description, this needs to be on your shelf (after you’ve read it, of course).

Christensen’s theories of disruptive innovation appeared first with the publication of The Innovator’s Dilemma in 1997. During the worst excesses of the dotcom boom, every startup business plan included an obligatory nod to Christensen and an assertion that its business model was truly disruptive. Who doesn’t want to be innovative, ideally disruptively so? Christensen and his colleagues have continued to develop his theories in The Innovator’s Solution: Creating and Sustaining Successful Growth, Seeing What’s Next: Using Theories of Innovation to Predict Industry Change, and now The Innovator’s Guide to Growth.

Christensen distinguishes two forms of innovation — sustaining and disruptive — not in terms of their technological features but in terms of their relationship to markets. The distinction is summarized in the following diagram, reproduced from The Innovator’s Guide to Growth.

[Image: Christensen’s disruptive innovation model]

In essence, Christensen’s theory of disruptive innovation flows from recognizing that the pace of technology improvement is generally more rapid than the capacity of users in the market to take advantage of those improvements. This differential is what opens up possibilities for differing approaches to innovation.

In this market-oriented theory of innovation, there are three paths available to organizations interested in articulating potentially disruptive strategies. The first is to identify and target “nonconsumers”: potential consumers for whom existing technologies fail to meet their particular needs. The second is to identify existing customers for whom existing technologies provide more technology than they need. The third is to investigate potential consumers in terms of what Christensen’s theory describes as “jobs to be done” as a path to defining new products and services to perform those jobs. I must confess that I still find this path the least well articulated aspect of the theory.

Throughout this book, the authors start by recapping the essentials of Christensen’s theoretical arguments and proceed to develop the next level of operational detail it takes to transform strategic insights into execution details. If your organization is seeking to develop its own disruptive strategy, the authors here have worked out many of the next-level questions and identified the supporting analyses and design steps you would need to answer and complete. The authors are clearly competent and talented consultants who are willing to share how they manage and do their work. Their hope, of course, is that many of you will conclude that you need their help to do the work. What is nice here is that they are confident enough in their abilities that they are quite thorough in what they share. This volume is not a teaser; it’s complete and coherent. You could pretty much take the book as a recipe and use it to develop your project plans. On the other hand, the plans by themselves won’t guarantee that you can assemble a team with the necessary qualifications to execute them successfully.

The other thing that this book does quite nicely is identify the kinds of organizational support structures and processes that you would want to put in place to institutionalize systematic disruptive innovation.

Christensen and his colleagues are continuing to build a rich, systematic, theory of disruptive innovation. With roots in academic research, they are freely sharing their insights and their methods. The Innovator’s Guide to Growth is a solid workbook that will let you develop your own skill at doing disruptive innovation. Of course, the plan by itself won’t eliminate the need to gain the experience for yourself. But it’s a lot better strategy than to have to work everything out from scratch on your own.

Learning from the failures of others; billion-dollar lessons for next to nothing

Billion-Dollar Lessons: What You Can Learn from the Most Inexcusable Business Failures of the Last 25 Years, Carroll, Paul B. and Chunka Mui

Progress in science and engineering proceeds from the dispassionate analysis of failure. We learn more when we screw up than when we succeed. However, since Waterman and Peters In Search of Excellence,

Updating ‘be prepared’ for the 21st Century

The Unthinkable: Who Survives When Disaster Strikes – and Why, Ripley, Amanda

Amanda Ripley has taken an interesting premise and turned it into an excellent book. A writer for Time magazine, she’s turned her attention to the lessons to be had from the ordinary people who survive extraordinary situations; those who got out of the World Trade Center on 9/11, who survived the tsunami in 2004, who make it out of burning planes and burning buildings. In place of the “be afraid” messages conveyed by the nightly news and by too many of those in positions of authority, she digs into the psychological dimensions of “be prepared” for the range of risks, real and imagined, that confront today’s average citizen.

There are two overarching messages woven into her fascinating storytelling around disasters big and small. The first is a simple model of the psychology of response (and non-response) to the unexpected threat; an arc of denial, deliberation, and decision. Ripley touches on our generally poor abilities to assess risk, how the physiology of fear interferes with our ability to think, why some people are more likely to be resilient than others, and why panic happens far less often than we think. More importantly, she demonstrates how small doses of attention and both mental and physical rehearsal improve the chances that you will be able to do the right thing should the need arise.

The second theme is about the central importance of regular people who are prepared to act when the moment comes. Through all of Ripley’s stories, whether of the World Trade Center or an ordinary car accident, by the time that official “first responders” and the authorities arrive, it’s too late. When the unexpected occurs, what you have with you and who you are surrounded by are what you get to work with. More often than not, that’s also more than enough.

Of course, there is a website and a blog associated with the book. Both appear to be better than the norm for these sorts of things. In particular, I’ve found the Unthinkable Blog to be worth adding to my list of RSS subscriptions.

Review of Tom Davenport’s "Competing on Analytics"

Competing on Analytics: The New Science of Winning, Davenport, Thomas H. and Jeanne G. Harris

 

Tom Davenport has turned his attention of late to the prospects for business intelligence and information analytics. Competing on Analytics offers a managerial introduction to the topic. It emphasizes why organizations ought to be interested in the topic, what kinds of payoffs they might expect, and how organizations will need to adapt to take advantage of robust analytics. Davenport and co-author Jeanne Harris of Accenture split the book into two major sections. The first deals with describing how analytics can be used as a competitive tool; the second with the organizational challenges of building analytical capabilities. Overall, it’s a relatively short book and is well-suited to its target audience. On the other hand, if you’re on the receiving end of a mandate to build an analytical capability after someone higher in the food chain has gotten excited about the topic, don’t expect quite as much in the way of detailed implementation advice.

Davenport and Harris set out a stage-model of analytical capabilities starting with "analytically impaired" and ending with "analytical competitors." Partly, this is to support an argument they make that there’s an advantage to managing analytics at an enterprise level. My cynical side suspects that this advantage lies primarily in providing a clear target for the likes of Accenture or SAS to sell to.

Given that every new capability benefits from senior executive attention and that everyone wants to get on the CEO’s calendar, are there, in fact, compelling reasons that analytics deserves to be on this short list? Two come to mind. One is that the expertise called for in effective analytics is scarce. Better to have that expertise directed at the targets of greatest opportunity by those best positioned to judge. Two, the competitive business opportunities that might yield to analytics are more likely to be found from the perspective of those with an integrative view of the enterprise.

The authors walk through major functions of the enterprise identifying opportunities and examples of how analytics have been successfully applied. There are clearly an abundance of opportunities to apply analytical tools and techniques to improving internal processes, optimizing supply chains, and leveraging marketing.

One problem with the focus on describing the business opportunities for analytics is that the variety of potentially applicable tools gets short shrift. All books have to make decisions about what to put in and what to leave out. Given the intended audience, I can understand the decision to focus on the business side of the equation rather than on the tools side. On the other hand, glossing over the complexities of the statistical tools and algorithms has its own risks. Organizations risk creating a new class of wizards whose dark arts must be taken on faith or they risk putting dangerous tools in the hands of amateurs who will be blind to both the limits and the dangers of the tools.

This brings us to the second part of the book and the challenges of building an analytical capability. I

Grounded advice on making better use of your brain

Brain Rules: 12 Principles for Surviving and Thriving at Work, Home, and School, Medina, John

John Medina is a molecular biologist bent on sharing how what we know about the brain can help us be more effective in the world at large. His central argument is that there are simple, but very important, lessons to be drawn from what science has learned in recent years about how the brain operates. Many of these lessons run counter to the practices and conventions that hold sway in our schools and organizations.

You can be pretty sure that I

Scary thoughts on one all-too-possible near future

Little Brother, Doctorow, Cory

It may be time to get more serious about using PGP and learning about Tor. Little Brother takes us into a near-future version of San Francisco where Jack Bauer has clearly become the U.S. Attorney General. Marcus is a seventeen-year-old high school student who likes to play video games and role play, and who has it all figured out (he’s 17, after all). In the wrong place at the wrong time, Marcus gets swept up by Homeland Security after a terrorist attack, held for questioning for days, and released with the threat that he’ll be watched carefully to make sure that he behaves.

The world that Marcus returns to has ratcheted up both fear and surveillance to something on the wrong side of a police state. He chooses to fight back mostly out of adolescent stubbornness coupled with enough technical expertise to be dangerous to both the nascent police state and to himself and his friends. Doctorow is becoming a better and better storyteller with each of his books, and Little Brother motors along. I pretty much dropped everything I should have been doing to plow through it over the course of two days.

The book has its flaws. The bad guys tend to be caricatures. The “lessons” about various hacks and technologies sometimes slow things down, although not as much as you might think. Doctorow may have his agenda but he remembers that his first priority is to entertain. Along the way, he also manages to get you to think about his larger message.