Learning from cases; getting smarter by design

One of my enduring memories from my first days in business school is a video interview of a second-year student offering advice on surviving the case method. While I didn’t fully appreciate it at the time, it was totally appropriate that his advice was a case study in its own right.

He began with a story about the business challenge of “sexing chickens.” In the chicken business, it turns out, it is important to separate roosters from hens while both are still no more than puffs of feathers. The clues are subtle and not reducible to a checklist. Novice chicken sexers can’t be taught to do their job, but they can learn.

Prospective chicken sexers go through an apprenticeship. They sit down with a batch of chicks, pick one up, flip it over, inspect, and guess. Sitting next to our prospect is a veteran chicken sexer. The veteran’s job is to give the prospect feedback: either a “yes” for a correct guess or a smack in the head for a mistake. After a hundred or so guesses, the prospect’s error rate will drop close to zero. Neither the prospect nor the veteran will be able to offer an explicit theory of what they are doing, yet they are effective.

Learning by the case method is a similar process of guessing and rapid, memorable, feedback. As an aside, recognize that this also describes the essence of machine learning. It is experiential learning at its purest.
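
To make the analogy concrete, here is a minimal sketch of that guess-and-feedback loop, assuming a made-up “chick sexing” task with a hidden linear rule and a simple perceptron-style correction; both are illustrative choices of mine, not a description of how real chicken sexers or any particular learning system works. The learner never sees the rule; it only receives a right/wrong signal, yet its error rate drops toward zero.

```python
# A minimal sketch of learning from guess-and-feedback. The synthetic task and
# the perceptron-style update are illustrative assumptions, not a real system.
import random

random.seed(0)

def make_chick():
    """Return (features, label) for one synthetic chick; the rule stays hidden from the learner."""
    x1, x2 = random.uniform(-1, 1), random.uniform(-1, 1)
    label = 1 if 0.7 * x1 - 0.3 * x2 > 0 else -1
    return (x1, x2), label

weights = [0.0, 0.0]      # the learner's implicit "theory" -- adjusted, never articulated
recent_errors = []

for trial in range(1000):
    (x1, x2), label = make_chick()
    guess = 1 if weights[0] * x1 + weights[1] * x2 >= 0 else -1
    if guess != label:    # the veteran's smack in the head
        weights[0] += label * x1
        weights[1] += label * x2
    recent_errors = (recent_errors + [guess != label])[-100:]

print(f"error rate over the last 100 guesses: {sum(recent_errors) / 100:.2f}")
```

Like the apprentice, the finished learner can classify reliably, but the two numbers in `weights` are not an explicit theory anyone could recite.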

The craft in designing case method learning lies in selecting and sequencing cases so that the lessons can be delivered more rapidly and reliably than the random accumulation of experience permits. The assumption here is that there is an order that can be exploited to guide action. There must be an underlying pattern that one can solve for.

If you subscribe to the value of the case method as a learning strategy, you are making a claim that there is a balance between theory and practice to be managed. It is a claim that the particulars matter; that experience or theory can only go so far in crafting a response; and that you have a responsibility to design a response that acknowledges that every situation is a mix of old and new, predictable and unpredictable.

There’s a notion here that I am struggling with that has to do with the rate of change. I think that case-based learning potentially makes this issue more visible.

In a slow-changing world, experience matters greatly. Recognize how this situation maps to what we’ve seen and responded to before, and the right action is clear. As the rate of change increases, the value of experience changes. Prior experiences suggest actions and responses that can serve as the basis for designing a modified response that blends old and new.

Pushing the rate of change still higher means that effective response demands more design and less “here’s what we’ve done before.” What does that imply for learning effectively?

My hypothesis is that we need to make the experiential learning process visible, explicit, and deliberate. In a conventional case-based learning environment, there is a separation between those who are learning and those who are facilitating the learning—which is not the same thing as teaching. The goal is for those learning to build a robust, internal, theory of action. The facilitators have strong ideas about what that theory should look like and cases are sequenced to force the learners to develop an internal theory that matches the target theory.

What’s happening in higher-paced environments is that our learners and facilitators are becoming harder to distinguish. In a sense, we are back to a world of pure learning from experience. What changes is that as learners we now must be responsible for building our theories dynamically.

While this can still be described as a form of reflective practice, that reflection now must operate at several levels of abstraction. We can’t rely simply on organic processes to slowly and unpredictably get smarter over time. We need to get smarter on purpose and by design.

The Siren Call of Proven Systems

Our admiration for the assembly line is so deep that we are suckers for the promise of “proven systems” regardless of their feasibility. We so treasure predictability and control that the promise seduces no matter how many times it is broken.

I started my career at what ultimately morphed into Accenture. That tenure coincided with the development of one of the early systems development methodologies—Method/1. The methodology was an attempt to make the process of understanding, designing, and implementing a technology solution to an organization’s business problem something that was manageable and repeatable. Method/1 made Accenture a lot of money and can be seen as a distant ancestor of a host of systems for building systems: agile, scrum, the Rational Unified Process, waterfall, extreme programming—the ingenuity of marketers in packaging and labeling ideas is endless.

All are variations on a theme. They embody a desire for control in a turbulent world. If Ford could manufacture identical cars and McDonald’s could guarantee the consistency of fries and shakes from coast to coast, then we ought to be able to turn out information systems with similar confidence in quality and predictability.

There seems to be an uptick in promises of proven systems in multiple settings, not simply in the arena of design and development. Given their appeal, it’s critical to recognize the limits of these promises.

The first limit is the dangerously thin data from which these proven systems are built. One of the first ideas pounded into your head when you are compelled to study statistics and research methods is that “data” is not the plural form of “anecdote.” Yet most proven systems are based on a handful of prior examples that happened to work.

Look at the history of Method/1. The late accounting firm Arthur Andersen & Co.—which birthed Accenture—automated the payroll department of a GE manufacturing plant in the 1960s. That first project failed and Andersen rebuilt the system at a loss to make good. Andersen’s sales pitch to their second client boiled down to “we’ve learned what mistakes to avoid while our competitors have yet to make them.” Several projects later, Andersen consultants documented what they had done and packaged their ad hoc approach into a standard project plan for internal use called the “client binders.” As an accounting firm, Andersen was accustomed to protecting client identities and to thinking of audit processes as something common and repeatable for all clients. Consequently, the client binders were devoid of client specifics and mirrored the look and feel of standardized, repeatable audit processes.

In spite of their limitations and the thin data they were built on, the client binders gave Andersen’s consultants a better starting point for new projects and helped Andersen extend and consolidate their lead in the market. They worked best in the hands of experienced consultants who could use these materials to organize and support productive conversations with clients and prospects about how to structure and manage new efforts.

Then the methodology zealots took over. They took crib notes that were valuable in expert hands and rewrote them into recipes for reasonably smart people with limited experience to follow. Repeatable, industrial, processes promise good economics to those who invent them and acceptable quality to those who seek the outputs.

The fundamental problem is that today’s knowledge work doesn’t consist of repeatable, industrial, processes no matter how much we wish they did or how often we claim they do. I’ve written before about the problems this strategy presents (Repeatable Processes and Magic Boxes).

Where does that leave us?

Be suspicious of claims about proven systems. Look for the demands for creative leaps and flashes of insight hiding within seemingly innocuous steps. Look for potentially endless cycles of analysis with no stopping rule. Find the magic box. Be wary of maps that don’t show where the roads end and the dragons hide.

Searching for the secret class

There was a time when I wanted to get my hands on the syllabus for the “secret class;” the secret being how to navigate the real world outside the classroom. Think of me as a male version of Hermione Granger; annoyingly book smart and otherwise pretty clueless. Classes were easy; life not so much.

Most of us spend enough time between classes in hallways and playgrounds to soak up the necessary experiences to get a clue. Through a peculiar set of circumstances and events, I was late in encountering and absorbing these experiences. My wife is on record that she would have crossed the street to avoid that earlier me. My therapist assures me that these lessons are learnable as long as I put my heart in circuit between my brain and my mouth.

Because this arena was foreign to me, I had to study it much as an anthropologist might: observing, cataloging, and making sense of what I saw. The biggest risk for an anthropologist is to “go native”; to leave the edge and immerse themselves in the action. Life demands immersion.

I make no claims about life in general. Life within organizations, however, is a place I understand. Thinking about secret classes and learning in organizational settings turns out to be a fruitful path. Really smart people, Chris Argyris and Donald Schon among them, have thought a great deal about the things people need to learn within organizations. They talk about skills and perspectives that are effectively secrets, acquired by way of experience rather than classrooms.

Experience alone is rarely sufficient to impart these lessons; managers and executives become effective by way of reflective practice. They must process and digest experience to transform it into effective managerial practice. Classic examples of this deliberate reflection are Chester Barnard’s The Functions of the Executive and Alfred Sloan’s My Years with General Motors.

These are valuable and insightful analyses of complex organizations and executive work. They also highlight critical ways that reflective practice must evolve. It’s a common trope that organizations operate in an increasingly fast and complex environment. It’s common because examples to demonstrate it surround us. In the early days of my career, we were still converting paper-based business processes to electronic. Today, we’re knitting together digital processes spanning multiple organizations and continents.

That plants us in a world where the volumes of digital data threaten to collapse into a wall of noise, and this wall feels more like the leading edge of a tsunami than a fixed landmark somewhere “over there.” Simply coping with the onslaught consumes our attention and drives too many of us and our organizations into reactive mode. We become driven by events. Planning feels like a luxury and the notion of reflecting seems an unrealistic, academic, dream. Peter Vaill makes a compelling argument that we now live in a world of permanent whitewater and have to learn to operate accordingly.

We have a conundrum then. A changing world demands changed responses, yet the pace of change leaves no time for the reflection needed to transform experience into new practice. What are the elements of a strategy that might even our odds? How do we make reflective practice work in the organizational environment that now exists?

Peter Vaill offers a clue in the title of the book where he talks of permanent whitewater, Learning as a Way of Being. Learning is not something neatly distinguishable from practice; we pursue learning while doing. Elsewhere, I’ve discussed the notion of observable work as a pre-condition for making that kind of learning possible. Then there are various techniques, such as after-action reviews, that should be in any knowledge worker’s toolkit.

There is no “secret class.” Even if one existed, it wouldn’t make sense to take it. Instead, we structure experience so that learning is a continuing parallel element of doing the work.

Effective Executives Are Design Thinkers

The Effective Executive: The Definitive Guide to Getting the Right Things Done (Harperbusiness Essentials) (Peter F. Drucker and Jim Collins)

There are certain smart, articulate, thinkers who provide anchors for my thinking; Peter Drucker is high on that list. Somewhere around the time I was figuring out that organizations caught and held my attention, I also stumbled upon Drucker.

Not too long ago, I added a copy of The Effective Executive to my Kindle and chose to revisit Drucker’s observations about effectiveness. I’ve been uncomfortable about the surge in attention around efficiency and productivity of knowledge workers and thought Drucker might have relevant insight.

He does.

Drucker was credited with coining the term knowledge worker and the bulk of his work focused on why they mattered and what to do about it. The Effective Executive was originally published in 1966 and was reissued in a 50th anniversary edition. It’s a bit scary to contemplate the level of insight available for the taking.

Organizations consist of two kinds of people: those who follow scripts and those who write them. For a long time—and certainly in 1966—the overwhelming majority of headcount fell into the first category. Drucker’s attention was always drawn to those who wrote the scripts—executives. What’s changed since then is that scripts have become the responsibility of software. Organizations are gradually—and not so gradually—eliminating people who follow scripts.

That makes Drucker’s observations and advice about the script writers orders of magnitude more important than they were then. Drucker argued that “working on the right things is what makes knowledge work effective.” Doing the right thing is more important than doing things right, regardless of how much of the visible activity in organizations seems to be about doing things right. The Effective Executive is Drucker’s extended examination of how to systematically go about doing the right thing.

Drucker’s worldview was not tainted by today’s slavish devotion to shareholder value. In Drucker’s world, organizations exist to create and serve customers; there is no possibility of value creation, much less maximization, until a customer exists. And customers exist outside the organization. This creates a fundamental challenge for executives, which Drucker characterizes as follows:

Every executive, whether his organization is a business or a research laboratory, a government agency, a large university, or the air force, sees the inside—the organization—as close and immediate reality. He sees the outside only through thick and distorting lenses, if at all. What goes on outside is usually not even known firsthand. It is received through an organizational filter of reports, that is, in an already predigested and highly abstract form that imposes organizational criteria of relevance on the outside reality.

He goes on to say that

The fundamental problem is the reality around the executive. Unless he changes it by deliberate action, the flow of events will determine what he is concerned with and what he does.

Regardless of the changing dynamics of organization and environment, executive effectiveness consists of judging whether a particular situation fits within the parameters of existing organizational scripts, will yield to one-time improvisation, or calls for developing a new script.

What this implies, and what Drucker argues, is that

effective decision is always a judgment based on “dissenting opinions” rather than on “consensus on the facts.” And [effective executives] know that to make many decisions fast means to make the wrong decisions. What is needed are few, but fundamental, decisions. What is needed is the right strategy rather than razzle-dazzle tactics.

Several things flow from that. Most importantly, effective decisions require meaningful chunks of time. Here, Drucker’s thinking connects closely with Cal Newport’s more recent discussion of deep work.

As is so often the case with Drucker, his insights get picked up and repackaged. His analysis of effectiveness amounts to a claim that executives are engaged in design thinking. More importantly, it is design thinking that strives to synthesize the analysis and insights of multiple specialized perspectives.

There is much more in this short book. It bears close reading and regular re-reading if you aspire to do meaningful executive work. To wrap this up, I want to examine one aspect of this design thinking perspective that appears to run counter to the rhetoric of many strategy efforts and many consulting proposals and reports. Let’s look at Drucker’s own words once again:

To get the facts first is impossible. There are no facts unless one has a criterion of relevance. Events by themselves are not facts.

The only rigorous method, the only one that enables us to test an opinion against reality, is based on the clear recognition that opinions come first—and that this is the way it should be. Then no one can fail to see that we start out with untested hypotheses—in decision-making as in science the only starting point. We know what to do with hypotheses—one does not argue them; one tests them. One finds out which hypotheses are tenable, and therefore worthy of serious consideration, and which are eliminated by the first test against observable experience.

The effective decision-maker, therefore, organizes disagreement. This protects him against being taken in by the plausible but false or incomplete. It gives him the alternatives so that he can choose and make a decision, but also so that he is not lost in the fog when his decision proves deficient or wrong in execution. And it forces the imagination

If finding the time and the money to pursue an MBA is currently difficult, consider adding The Effective Executive to your reading stack and putting it near the top.

Don’t connect the dots; solve for pattern

Last time, we looked at the limits of a rhetorical strategy (“connecting the dots”) as a tool for planning. There are too many dots forming too many possible pictures, yet we want to take advantage of what we know has gone before to set a way forward. There is a better way, which resides in “solving for pattern,” a phrase I first encountered in an essay by Wendell Berry.

During my first weeks of business school, marketing mystified me. I had a decent background in accounting and project management; those classes I could make sense of. In marketing, I was lost during the discussions. When the professor laid out the analysis at the end, I was no more enlightened than I had been eighty minutes earlier.

I concluded that the fatal flaw of the case method was its failure to provide neat pictures I could lay over the dots scattered throughout the case. Shortly before the midterm exam, a second-year student offered a review session with a roadmap for how to make sense of any marketing problem. I now had the picture that I could lay over the dots and connect them myself. The faculty was very distressed by this review session. We thought it was because their secrets had been revealed.

Perhaps.

I still made a mess of the midterm.

It was much later, after I had worked as a case writer, that I began to grasp that the faculty was pursuing a deeper agenda. The case method is not about learning how to connect the dots; it’s about learning how to solve for pattern. That second-year student’s crib sheet was no help for dealing with a supply chain problem masquerading as a marketing problem.

Elements of Solving for Pattern

Organizations are stuffed with people ready to solve problems once they’ve been labeled. What is rare—and more valued—is the ability to frame and structure problems so that they can be solved. There are no simple pictures hiding in the environment waiting to be revealed and pointing to pre-determined responses. The proliferating dots are signals in the environment worth attending to; ignoring them is not wise. If we’re not looking for dots to connect and trigger responses, what are we to do? Solving for pattern is a problem framing and response design process that better fits the world we occupy.

Wendell Berry introduced the term in an effort to contrast industrial farming strategies with family farming strategies. The strength of the strategies Berry advocated came not from swapping corporate managers for family ownership but from treating farms as complex systems rather than faux factories. As a trivial example, Berry points to replacing chemical fertilizers on crops with the manure produced by the milk cows feeding off those same crops. Connecting those two isolated systems into a larger dynamic system produces better outputs, reduces costs, and has other positive effects that ripple through other components of the overall farm system.
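
A back-of-the-envelope sketch may help. The quantities and cost coefficients below are invented purely for illustration; the point is only that coupling the two subsystems changes the arithmetic, turning manure from a disposal cost into an input that offsets purchased fertilizer.

```python
# A toy arithmetic sketch of Berry's example. All numbers are invented for
# illustration; this is not a model of any actual farm.

def isolated_subsystems(acres=100, cows=40):
    fertilizer = acres * 30        # buy every unit of crop nutrients
    manure_disposal = cows * 5     # treat manure as waste to be hauled away
    return fertilizer + manure_disposal

def coupled_system(acres=100, cows=40, substitution=0.5):
    # Manure from the herd offsets a share of purchased fertilizer,
    # and the disposal cost disappears because nothing is hauled away.
    fertilizer = acres * 30 * (1 - substitution)
    return fertilizer

print("annual cost, isolated subsystems:", isolated_subsystems())  # 3200
print("annual cost, coupled system:     ", coupled_system())       # 1500.0
```

The gain comes not from optimizing either subsystem on its own but from redrawing the boundary so the two form a single larger system.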

Much of solving for pattern is a change of perspective. It starts by viewing the current situation as a dynamic equilibrium producing outcomes that are a mix of desirable and undesirable results. The goal is not to fix a component in a bigger machine but to push the system into a new stable equilibrium with a better balance of desirable and undesirable outcomes.

This strategy of solving for pattern melds pattern recognition, systems thinking, design skill, and change management practices to effect these rebalancing efforts. Pattern recognition and systems thinking are the tools we need to understand where we are and the trajectory we are on. That understanding feeds into the design and change management efforts needed to envision and shift from the pattern we have to the pattern we want.

In his essay, Berry was exploring the complexities of managing the miniature ecosystem of a working farm. Whether you call them problems, opportunities, or messes, the situations that arise on farms—and in organizations—never come with labels or instructions attached. Nor are they neatly isolated from whatever else may be going on. Tackling those situations effectively starts with choosing a better point of view.

The limits on connecting the dots

I was a big fan of Sherlock Holmes and various other fictional detectives growing up; I’m still drawn to the form when I want to relax. There’s a basic pleasure in trying to match wits with Sherlock and figure out who the killer is before the final reveal. Connecting the dots is a rewarding game, but it’s a flawed strategy.

The picture is always clear after the fact. But to jump from that fact to the assumption that you can recognize the dots earlier and connect them into one single, compelling, inarguable picture that triggers right action is a leap too far. Detective stories have an advantage that truth lacks; the writer knows where they intend to go.

The problem is that connecting the dots is such a well-worn storytelling and story-organizing technique. The key to using it as a storytelling technique is to manage the ratio of story dots to noise dots. Looking backward to make sense of the outcome that came to pass, the storyteller emphasizes the dots that reveal the picture and shares just enough other dots to suggest the work our hero had to do to see the hidden picture. As a good storyteller, you might add a clever twist or put our hero in just the right spot to see what others have missed.

Suppose, however, that in the real world our potential hero was looking in the wrong direction or trying to separate the relevant dot from a thousand irrelevant but similar dots. Now our hero might be portrayed as a fool or an incompetent. Our hero’s reputation and future hang on the objectives of the storyteller, not on the events in the field.

The appeal of the connecting the dots rhetorical strategy obscures its flaws as a thinking strategy. There are two principal flaws from a problem-solving perspective. One, it suggests there is a single, correct, picture of what is going on waiting to be discovered as soon as someone draws the correct thread through the data points that otherwise appear to be noise. Two, it presumes that recognizing the picture offers sufficient, timely, insight into appropriate action steps.

There are always too many dots; in a big data world we seek more dots in the hope that the picture will emerge. All the analytical tools will do, however, is add more possible pictures that might be hiding in the dots. Any collection of dots might form multiple pictures; more dots simply suggest more pictures.
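
Here is a quick sketch of that first flaw, using an arbitrary handful of points and ordinary polynomial fitting (my choice of illustration, not a claim about any particular analytical tool): the same dots are consistent with several different curves, and each additional degree of freedom only adds another candidate picture.

```python
# A small illustration that the same dots admit many "pictures."
# The data points are arbitrary; the curves are ordinary polynomial fits.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.2, 2.9, 4.1, 5.0])   # five dots

for degree in (1, 2, 3, 4):
    coeffs = np.polyfit(x, y, degree)
    residual = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    print(f"degree-{degree} curve: sum of squared residuals = {residual:.4f}")

# Every added degree of freedom fits the same dots at least as well; by
# degree 4 the curve passes through every point exactly. The dots alone
# cannot tell us which picture is the real one.
```
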
Which brings us to the second problem; how does recognizing a picture connect to effective response? In our detective stories, our goal is to bring the killer to justice, perhaps to intervene before the killer strikes again.

In the world of more dots than time, our goal is to devise a response soon enough to make a difference. We wish to create a new story to be told. The dots behind us are only useful in suggesting where to place the dots that are not yet part of the picture we are in the act of creating.