Chaos players: knowledge work as performance art

Stage - Auditorium. Photo by Monica Silvestre from Pexels

All the world’s a stage,
And all the men and women merely players;
They have their exits and their entrances,
And one man in his time plays many parts.
As You Like It, Act II, Scene VII. William Shakespeare

I’ve been thinking about the role of mental models in sense-making. While we use such models all the time, I think there is significant incremental value in making them explicit and then playing with them to tease out their implications. Organization as machine is a familiar example, one that I believe is largely obsolete. Organization as ecosystem or complex adaptive system has grown in popularity. It has the advantage of being richer and more sensitive to the complexities of modern organizations and their environments. On the other hand, that mental model is a bit too appealing to academic and consulting desires to sound simultaneously profound and unintelligible. It fails to provide useful guidance through the day-to-day challenges of competing and surviving.

Organization as performance art or theatrical production offers a middle ground between simplistic and over-engineered. It appeals to me personally, given a long history of staging and producing. It’s my hypothesis that most of us have enough nodding familiarity with the theater to take advantage of the metaphor and model, without so much knowledge that the little details interfere with the deeper value.

The goal of theater is to produce an experience for an audience. That experience must always be grounded in the practical art of the possible. This gives us something to work with.

Let’s work backwards from the performance. We have the players on the stage and an audience with expectations about what they are about to experience. If that is all we have, then we are in the realm of storytelling. Storytelling demands both the tellers of the tale and the creator of the tale itself. Our playwright starts with an idea and crafts a script to bring that idea to life and connect it to all of the other stories and ideas the audience will bring to the experience. We now have a story, its author, storytellers, and an audience with their expectations.

Theater takes us a step further and asks us to think about the production values that contribute to and enhance the experience we hope to create. Stage and sets and lighting and sound can all be drawn into service of the story. Each calls for different expertise to design, create, and execute. We now have multiple experts who must collaborate and processes that must be managed. Each must contribute to the experience being created. More importantly, those contributions must all be coordinated and integrated into the intended experience.

This feels like a potentially fruitful line of inquiry. It seems to align well with an environment that depends on creativity and innovation as much as or more than simple execution. How deeply should it be developed?

Going behind the screen: mental models and more effective software leverage

I’ve been writing at a keyboard now for five decades. As it’s Mother’s Day, it is fitting that my mother was the one who encouraged me to learn to type. Early in that process, I was also encouraged to learn to think at the keyboard and skip the handwritten drafts. That was made easier by my inability to read my own handwriting after a few hours.

I first started text editing as a programmer writing Fortran on a Xerox SDS Sigma computer. I started writing consulting reports on Wang Word Processors. When PCs hit the market, I made my way through a variety of word processors including WordStar, WordPerfect, and Microsoft Word. I also experimented with an eclectic mix of other writing tools such as ThinkTank, More, Grandview, Ecco Pro, OmniOutliner, and MindManager. Today, I do the bulk of my long-form writing using Scrivener on a Mac together with a suite of other tools.

The point is not that I am a sucker for bright, shiny objects—I am—or that I am still in search of the “one true tool.” This parade of tools across years and multiple technology platforms leads me to the observation that we would be wise to pay much more attention to our mental models of software and the thinking processes they support.

That’s a problem because we are much more comfortable with the concrete than the abstract. You pick up a shovel or a hammer and what you can do is pretty clear. Sit down at a typewriter with a stack of paper and, again, you can muddle through on your own. Replace the typewriter and paper with a keyboard and a blank screen and life grows more complicated.

Fortunately, we are clever apes and as good disciples of Yogi Berra we “can observe a lot just by watching.” The field of user interface design exists to smooth the path to making our abstract tools concrete enough to learn.

UI design falls short, however, by focusing principally on the point where our senses—sight, sound, and touch—meet the surface of our abstract software tools. It’s as if we taught people how to read words and sentences but never taught them how to understand and follow arguments. We recognize that a book is a kind of interface between the mind of the author and the mind of the reader. A book’s interface can be done well or done badly, but the ultimate test is whether we find a window into the thoughts and reasoning of another.

We understand that there is something deeper than the words on the page. Our goal is to get behind the words and into the thinking of the author. The same goal exists in software; we need to go behind the interface we see on the screen to grasp the programmer’s thinking.

We’ll come back to working with words in a moment. First, let’s look at the spreadsheet. I remember seeing Visicalc for the first time—one year too late to get me through first-year Finance. What was visible on the screen mirrored the paper spreadsheets I used to prepare financial analyses and budgets. The things I understood from paper were there on the screen; things that I wished I could do with paper were now easy in software. Those capabilities were already available by way of much more complex and inscrutable software tools, but Visicalc’s interface created a link between my mind and Dan Bricklin’s that opened up the possibilities of the software. I was able to get behind the screen and that gave me new power. That same mental model can also be a hindrance if it ends up limiting your perception of new possibilities.

Word processors also represent an interface between writers and the programmer’s model of how writing works. That model can be more difficult to discern. If writer and programmer have compatible models, the tools can make the process smoother. If the models are at odds, then the writer will struggle and not understand why.

Consider the first stand-alone word processors like the Wang. These were expensive, single-function machines. The target market was organizations with separate departments dedicated to the production of documents: insurance policies, user manuals, formal reports, and the like. The users were clerical staff—generally women—whose job was to transform handwritten drafts into finished product. The software was built to support that business process, and the process was reflected in the design and operation of the software. Functions and features of the software supported revising copy, enforcing formatting standards, and other requirements of the business.

The economics that drove the personal computer revolution changed the potential market for software. While Wang targeted organizations and word processing as an organizational function, software programmers could now target individual writers. This led to a proliferation of word processing tools in the 1980s and 1990s reflecting multiple models of the writing process. For example, should the process of creating a draft be separate from the process of laying out the text on the page? Should the instructions for laying out the text be embedded in the text of the document or stored separately? Is a long-form product such as a book a single computer file, a collection of multiple files, or something else?
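To make the second of those questions concrete, here is a small sketch in Python contrasting the two designs: formatting embedded in the text itself versus formatting stored separately and applied at output time. The control-code and style conventions are invented for illustration, not drawn from any particular product.

```python
# Two answers to "where do layout instructions live?"
# The markup conventions below are invented for illustration.

# Design 1: formatting embedded in the text itself, in the spirit
# of control codes sprinkled through a WordStar-era document.
embedded = "^BChapter One^B It was a dark and stormy night."

# Design 2: plain text kept apart from a style map that is
# applied only at output time, in the spirit of stylesheets.
text = "Chapter One\nIt was a dark and stormy night."
styles = {"Chapter One": {"bold": True, "size": 18}}

def render(text, styles):
    """Apply the separately stored styles when producing output."""
    for line in text.splitlines():
        print(styles.get(line, {}), "->", line)

render(text, styles)
```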

Those decisions influence the writer’s process. If your process meshes with the programmer’s, then life is good. If they clash, the tool will get in the way of good work.

If you don’t recognize the issue, then your success or failure with a specific tool can feel capricious. If you select a tool without thinking about this fit, then you might blame yourself for problems and limitations that are caused by using a tool that clashes with your process.

Suppose we recognize that this issue of mental models exists. How do we take advantage of that perspective to become more effective in leveraging available tools? A starting point is to reflect on your existing work practices and look for the models you may be using. Are there patterns in the way you approach the work? Do you start a writing project by collecting a group of interesting examples? Do you develop an explicit hypothesis and search out matching evidence? Do you dive into a period of research into what others have written and look for holes?

Can you draw a picture of your process? Identify the assumptions driving your process? Map your software tools against your process?
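One way to answer those questions is to write the mapping down as data and look for the gaps. A minimal sketch in Python; the stages and tool names are illustrative placeholders, not recommendations.

```python
# Map each stage of a writing process to the tools that support it;
# stages and tool names are illustrative placeholders.
process = ["collect examples", "form hypothesis", "research",
           "draft", "revise", "publish"]
tools = {
    "collect examples": ["notebook app"],
    "draft": ["Scrivener"],
    "revise": ["Scrivener"],
}

for stage in process:
    support = tools.get(stage, [])
    status = ", ".join(support) if support else "NO TOOL: gap or unexamined habit?"
    print(f"{stage:17} -> {status}")
```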

These are the kinds of questions that designers ask and answer about organizational processes. When we work inside organizations, we accept the processes as a given. In today’s environment, an individual knowledge worker has the capacity to operate at a level of effectiveness that whole organizations managed not that long ago. Given that level of potential productivity and effectiveness, we need to apply the same level of explicit thought and design to our personal work practices.

Design projects before worrying about managing them

I’m gearing up to teach project management again over the summer. There’s a notion lurking in the back of my mind that warrants development.

We would do a better job at managing projects if we spent more time designing them first.

We turn something into a project when we encounter a problem that we haven’t tackled before and can’t see exactly how to solve in one step. The mistake we make is to spend too much time thinking about the word “management” at the expense of the word “project.”

Suppose our problem is to improve the way we bring new hires on board in a rapidly growing organization. There’s an obligatory amount of HR and administrative paperwork to complete, but the goal is to integrate new people into the organization. Do you send them off immediately to new assignments? Do you design a formal training program? There’s a host of questions and a host of possible ideas, so you now have a potential project.

Given the pressures in organizations to “get on with it,” the temptation is to grab a half-formed idea, sketch a plausible plan, guess at a budget, pick a deadline, and start. If we have some project management practices and discipline, we’ll flesh out the plan and develop something that looks like a work breakdown structure and a timeline.

What we don’t give sufficient attention to is the design thinking that could transform this project idea into something that adds meaningful and unexpected value to the organization. We don’t give that attention because we don’t view a project as something worthy of design.

What would it mean to design a project? Two ideas come to mind. The first would be to generate ways to increase the value of the effort. In our on-boarding example, one objective was to offer new employees practice in preparing client presentations. Rather than develop a generic presentation skills module, the project team came up with a better idea. We asked teams in the field for research and competitive intelligence tasks on their to-do lists and transformed those into assignments for the on-boarding program. This allowed us to accomplish the presentation skills training, contribute to actual client work, and build bridges between new employees and their future colleagues.

A second design thinking aspect would be a more deliberate focus on generating multiple concepts and approaches that would evolve into a work breakdown structure. My conjecture is that more projects are closer to one-off efforts than to executing the nth iteration of a programming or consulting project. Project management personalities are tempted to standardize and control prematurely. But standard work plans and templates aren’t relevant until there are enough projects of a certain category to warrant the investment. Doing this kind of design work calls for expanding the toolset beyond the conventional project management toolkit that is focused on tracking and monitoring the sequence of tasks.
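To make that distinction concrete, here is a minimal sketch in Python of what generating multiple concepts before committing to a single work breakdown structure might look like. The task names echo the on-boarding example above; the tree representation is my own invention, not a standard project management structure.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A node in a candidate work breakdown structure."""
    name: str
    subtasks: list["Task"] = field(default_factory=list)

    def leaves(self):
        """Yield the concrete work packages at the bottom of the tree."""
        if not self.subtasks:
            yield self
        for t in self.subtasks:
            yield from t.leaves()

# Two candidate concepts for the same on-boarding project.
# Design means comparing concepts before committing to one WBS.
generic = Task("On-boarding", [
    Task("Admin paperwork"),
    Task("Presentation skills module",
         [Task("Hire trainer"), Task("Run workshop")]),
])

field_driven = Task("On-boarding", [
    Task("Admin paperwork"),
    Task("Client-research assignments", [
        Task("Collect to-do items from field teams"),
        Task("Package items as new-hire assignments"),
        Task("Present findings back to field teams"),
    ]),
])

for concept in (generic, field_driven):
    print(concept.name, "->", [t.name for t in concept.leaves()])
```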

This is a notion in development. It isn’t fully baked but my instinct is that it is worth fleshing out. Some questions I’m curious to get reactions to include:

  • Is making a distinction between project design and project execution worth the trouble?
  • How do standard project management tools—Gantt charts, project management software such as Microsoft Project or Primavera, or spreadsheets—interfere with early-stage idea generation and creativity?
  • How might you/do you expand the toolset to support better design thinking?

Review – Scrum: The Art of Doing Twice the Work in Half the Time

Scrum: The Art of Doing Twice the Work in Half the Time. Jeff Sutherland and JJ Sutherland

There’s a rule of thumb in software development circles that the best programmers can be ten times as productive as average programmers. This is the underlying argument for why organizations seek to find and hire the best people. There is research to support this disparity in productivity. There is similar research on the relative productivity of teams. There, the range between the best and the rest is closer to three orders of magnitude: as much as 1,000 times more productive.

Jeff Sutherland makes the case that the practices that collectively make up “Scrum” are one strategy for realizing those kinds of payoffs.

Sutherland is a former fighter pilot, a software developer, one of the original signatories of the Agile Manifesto, and the inventor of Scrum. This is his story of how Scrum came to be, what it is, and why it’s worthwhile. The particular value of the book is its focus on why Scrum is designed the way it is and why that matters.

Scrum’s origins and primary applications have been in the realm of software development. Sutherland builds an argument that Scrum’s principles and methods apply more broadly. Although he doesn’t make this argument directly, this wider applicability flows from evolutionary changes in the organizational environment. Changes in the pace and complexity of organizational work simultaneously make conventional approaches less effective and Scrum more so.

Scrum is a collection of simple ideas; it’s a point of view about effective problem solving more than a formalized methodology. That’s important to keep in mind because, as with too many solid ideas, the essence can get lost in the rush to capitalize on them. There appears to be an unlimited supply of training courses, consultants, and the usual paraphernalia of a trendy business idea; you’re better off spending time reading and thinking about what Sutherland has to say first. That may be all you need if you’re then willing to make the effort to put those ideas into practice.

Sutherland traces the roots of Scrum to the thinking underlying the Toyota Production System. He also draws interesting links to John Boyd’s work on strategy embodied in the OODA Loop and to the martial arts. Scrum is built on shortening the feedback loop between plans and action. It is a systematic way of feeling your way forward and adapting to the terrain as you travel over it.

Sutherland draws a sharp contrast with more traditional management techniques such as Waterfall project management approaches and their well-worn trappings such as Gantt charts and voluminous unread and unreadable requirements documents.

Understanding the managerial appeal and limitations of these trappings is key to grasping the contrasting benefits of Scrum. Waterfalls and Gantt charts appeal to managers because they promise certainty and control. They can’t deliver on that promise in today’s environment. In the software development world, they never could and in today’s general organizational environment they also come up short.

Understanding that appeal and why it is misplaced clarifies the strengths of Scrum. There was a time when managers came from the ranks of the managed. They had done the work they were now responsible for overseeing and were, therefore, qualified to provide the direction and feedback needed to pick a path and follow it. Management was primarily about execution and not about innovation.

The illusion in waterfall and other planning exercises is that what we are doing next is a repeat of what we have done before. If we have built 100 houses, we can be confident of what it will take to build the 101st. If we are building a new road or a new bridge, then what we have learned from the previous roads and bridges we’ve built can provide a fairly precise estimate for the next.

This breaks down, however, when we are building in new terrain or experimenting with new designs. The insights and experience of those who’ve built in the past don’t transfer cleanly to this more dynamic environment. The world of software has always been new territory and we are always experimenting. The terrain is always in flux even when the technology is temporarily stable. Now, it is those who are doing the work who are best positioned to plan and manage as we move into new territories and terrain.

Scrum comes into play when we are moving into territory where there are no roads and no maps. In new territory, you can only plan as far ahead as you can see. Sutherland puts it thus:

Scrum embraces uncertainty and creativity. It places a structure around the learning process, enabling teams to assess both what they’ve created and, just as important, how they created it. The Scrum framework harnesses how teams actually work and gives them the tools to self-organize and rapidly improve both speed and quality of work.

There’s a terminology and a set of techniques that make up Scrum. Sutherland covers the basics of such notions as scrum masters, product owners, backlog, sprints, retrospectives, communication saturation, continuous improvement, and stand-up meetings. But he’s no fan of turning these into dogma.
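For readers who think in code, here is a minimal sketch of those sprint mechanics in Python. The class and method names are my own invention for illustration; they come from neither the book nor any Scrum tooling.

```python
from collections import deque

class Sprint:
    """A minimal sketch of one Scrum cycle: pull work from a
    prioritized backlog, do it, then inspect and adapt."""

    def __init__(self, backlog, capacity):
        self.backlog = deque(backlog)   # ordered by the product owner
        self.capacity = capacity        # how much the team takes on
        self.done, self.lessons = [], []

    def plan(self):
        """Pull only as much as the team can finish this cycle."""
        return [self.backlog.popleft()
                for _ in range(min(self.capacity, len(self.backlog)))]

    def run(self, work):
        self.done.extend(work)          # daily stand-ups track progress here

    def retrospective(self, lesson):
        """Feed what was learned back into how the next sprint runs."""
        self.lessons.append(lesson)

backlog = ["story A", "story B", "story C", "story D"]
sprint = Sprint(backlog, capacity=2)
sprint.run(sprint.plan())
sprint.retrospective("capacity estimate held; keep it")
print(sprint.done, list(sprint.backlog))
```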

Scrum runs the risk of being viewed as no more than the latest management fad. Sutherland is a true believer and has evidence to support his belief. There are lots of true believers but only a few are willing to bring substantive evidence to back up that belief. That earns Sutherland the right to offer his own closing argument:

What Scrum does is alter the very way you think about time. After engaging for a while in Sprints and Stand-ups, you stop seeing time as a linear arrow into the future but, rather, as something that is fundamentally cyclical. Each Sprint is an opportunity to do something totally new; each day, a chance to improve. Scrum encourages a holistic worldview. The person who commits to it will value each moment as a returning cycle of breath and life.

The heart of Scrum is rhythm. Rhythm is deeply important to human beings. Its beat is heard in the thrumming of our blood and rooted in some of the deepest recesses of our brains. We’re pattern seekers, driven to seek out rhythm in all aspects of our lives.

What Scrum does is create a different kind of pattern. It accepts that we’re habit-driven creatures, seekers of rhythm, somewhat predictable, but also somewhat magical and capable of greatness.

When I created Scrum, I thought, What if I can take human patterns and make them positive rather than negative? What if I can design a virtuous, self-reinforcing cycle that encourages the best parts of ourselves and diminishes the worst? In giving Scrum a daily and weekly rhythm, I guess what I was striving for was to offer people the chance to like the person they see in the mirror.

The Siren Call of Proven Systems

Our admiration for the assembly line is so deep that we are suckers for the promise of “proven systems” regardless of their feasibility. We so treasure predictability and control that the promise seduces no matter how many times it is broken.

I started my career at what ultimately morphed into Accenture. That tenure coincided with the development of one of the early systems development methodologies—Method/1. The methodology was an attempt to make the process of understanding, designing, and implementing a technology solution to an organization’s business problem something that was manageable and repeatable. Method/1 made Accenture a lot of money and can be seen as a distant ancestor of a host of systems for building systems: agile, scrum, rational unified, waterfall, extreme programming—the ingenuity of marketers in packaging and labeling ideas is endless.

All are variations on a theme. They embody a desire for control in a turbulent world. If Ford could manufacture identical cars and McDonald’s could guarantee the consistency of fries and shakes from coast to coast, then we ought to be able to turn out information systems with similar confidence in quality and predictability.

There seems to be an uptick in promises of proven systems in multiple settings, not simply in the arena of design and development. Given their appeal, it’s critical to recognize the limits of these promises.

The first limit is the dangerously thin data from which these proven systems are built. One of the first ideas pounded into your head when you are compelled to study statistics and research methods is that “data” is not the plural form of “anecdote.” Yet most proven systems are based on a handful of prior examples that happened to work.
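A quick way to see how thin the data is: the statistician’s “rule of three” says that after n successes with zero failures, the 95% upper confidence bound on the true failure rate is still roughly 3/n. A few successful projects rule out very little.

```python
# Rule of three: n successes, zero failures => the true failure
# rate could still plausibly be as high as ~3/n (95% confidence).
for n in (3, 5, 10, 100):
    print(f"{n:3} successful projects -> failure rate could still be ~{3/n:.0%}")
```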

Look at the history of Method/1. The late accounting firm Arthur Andersen & Co.—which birthed Accenture—automated the payroll department of a GE manufacturing plant in the 1960s. That first project failed and Andersen rebuilt the system at a loss to make good. Andersen’s sales pitch to their second client boiled down to “we’ve learned what mistakes to avoid while our competitors have yet to make them.” Several projects later, Andersen consultants documented what they had done and packaged their ad hoc approach into a standard project plan for internal use called the “client binders.” As an accounting firm, Andersen was accustomed to protecting client identities and to thinking of audit processes as something common and repeatable for all clients. Consequently, the client binders were devoid of client specifics and mirrored the look and feel of standardized, repeatable audit processes.

In spite of their limitations and the thin data they were built on, the client binders gave Andersen’s consultants a better starting point for new projects and helped Andersen extend and consolidate their lead in the market. They worked best in the hands of experienced consultants who could use these materials to organize and support productive conversations with clients and prospects about how to structure and manage new efforts.

Then the methodology zealots took over. They took the crib notes that were valuable in expert hands and rewrote them into recipes for reasonably smart people with limited experience to follow. Repeatable, industrial processes promise good economics to those who invent them and acceptable quality to those who seek the outputs.

The fundamental problem is that today’s knowledge work doesn’t consist of repeatable, industrial processes, no matter how much we wish it did or how often we claim it does. I’ve written before about the problems this strategy presents (Repeatable Processes and Magic Boxes).

Where does that leave us?

Be suspicious of claims about proven systems. Look for the demands for creative leaps and flashes of insight hiding within seemingly innocuous steps. Look for potentially endless cycles of analysis with no stopping rule. Find the magic box. Be wary of maps that don’t show where the roads end and the dragons hide.

Effective Executives Are Design Thinkers

The Effective Executive: The Definitive Guide to Getting the Right Things Done (Harperbusiness Essentials) (Peter F. Drucker and Jim Collins)

There are certain smart, articulate thinkers who provide anchors for my thinking; Peter Drucker is high on that list. Somewhere around the time I was figuring out that organizations caught and held my attention, I also stumbled upon Drucker.

Not too long ago, I added a copy of The Effective Executive to my Kindle and chose to revisit Drucker’s observations about effectiveness. I’ve been uncomfortable about the surge in attention around efficiency and productivity of knowledge workers and thought Drucker might have relevant insight.

He does.

Drucker is credited with coining the term “knowledge worker,” and the bulk of his work focused on why knowledge workers mattered and what to do about it. The Effective Executive was originally published in 1966 and was reissued in a 50th anniversary edition. It’s a bit scary to contemplate the level of insight that has been available for the taking for that long.

Organizations consist of two kinds of people: those who follow scripts and those who write them. For a long time—and certainly in 1966—the overwhelming majority of headcount fell into the first category. Drucker’s attention was always drawn to those who wrote the scripts—executives. What’s changed since then is that scripts have become the responsibility of software. Organizations are gradually—and not so gradually—eliminating people who follow scripts.

That makes Drucker’s observations and advice about the script writers orders of magnitude more important than they were then. Drucker argued that “working on the right things is what makes knowledge work effective.” Doing the right thing is more important than doing things right, regardless of how much of the visible activity in organizations seems to be about doing things right. The Effective Executive is Drucker’s extended examination of how to systematically go about doing the right thing.

Drucker’s worldview was not tainted by today’s slavish devotion to shareholder value. In Drucker’s world, organizations exist to create and serve customers; there is no possibility of value creation, much less maximization, until a customer exists. And customers exist outside the organization. This creates a fundamental challenge for executives, which Drucker characterizes as follows:

Every executive, whether his organization is a business or a research laboratory, a government agency, a large university, or the air force, sees the inside—the organization—as close and immediate reality. He sees the outside only through thick and distorting lenses, if at all. What goes on outside is usually not even known firsthand. It is received through an organizational filter of reports, that is, in an already predigested and highly abstract form that imposes organizational criteria of relevance on the outside reality.

He goes on to say that

The fundamental problem is the reality around the executive. Unless he changes it by deliberate action, the flow of events will determine what he is concerned with and what he does.

Regardless of the changing dynamics of organization and environment, executive effectiveness consists of judging whether a particular situation fits within the parameters of existing organizational scripts, will yield to one-time improvisation, or calls for developing a new script.

What this implies, and what Drucker argues, is that

effective decision is always a judgment based on “dissenting opinions” rather than on “consensus on the facts.” And [effective executives] know that to make many decisions fast means to make the wrong decisions. What is needed are few, but fundamental, decisions. What is needed is the right strategy rather than razzle-dazzle tactics.

Several things flow from that. Most importantly, effective decisions require meaningful chunks of time. Here, Drucker’s thinking connects closely with Cal Newport’s more recent discussion of deep work.

As is so often the case with Drucker, his insights get picked up and repackaged. Drucker’s analysis of effectiveness means that executives are engaged in design thinking. More importantly, it is design thinking that strives to synthesize the analysis and insights of multiple specialized perspectives.

There is much more in this short book. It bears close reading and regular re-reading if you aspire to do meaningful executive work. To wrap up, I want to examine one aspect of this design thinking perspective that appears to run counter to the rhetoric of many strategy efforts and many consulting proposals and reports. Let’s look at Drucker’s own words once again:

To get the facts first is impossible. There are no facts unless one has a criterion of relevance. Events by themselves are not facts.

The only rigorous method, the only one that enables us to test an opinion against reality, is based on the clear recognition that opinions come first—and that this is the way it should be. Then no one can fail to see that we start out with untested hypotheses—in decision-making as in science the only starting point. We know what to do with hypotheses—one does not argue them; one tests them. One finds out which hypotheses are tenable, and therefore worthy of serious consideration, and which are eliminated by the first test against observable experience.

The effective decision-maker, therefore, organizes disagreement. This protects him against being taken in by the plausible but false or incomplete. It gives him the alternatives so that he can choose and make a decision, but also so that he is not lost in the fog when his decision proves deficient or wrong in execution. And it forces the imagination

If finding the time and the money to pursue an MBA is currently difficult, consider adding The Effective Executive to your reading stack and putting it near the top.

Don’t connect the dots; solve for pattern

Last time, we looked at the limits of a rhetorical strategy (“connecting the dots”) as a tool for planning. There are too many dots forming too many possible pictures, yet we want to take advantage of what we know has gone before to set a way forward. There is a better way, which resides in “solving for pattern,” a phrase I first encountered in an essay by Wendell Berry.

During my first weeks of business school, marketing mystified me. I had a decent background in accounting and project management; those classes I could make sense of. In marketing, I was lost during the discussion. When the professor laid out the analysis at the end I was no more enlightened than I had been eighty minutes earlier.

I concluded that the fatal flaw of the case method was its failure to provide neat pictures I could lay over the dots scattered throughout the case. Shortly before the midterm exam, a second-year student offered a review session with a roadmap for how to make sense of any marketing problem. I now had the picture that I could lay over the dots and connect them myself. The faculty was very distressed by this review session. We thought it was because their secrets had been revealed.

Perhaps.

I still made a mess of the midterm.

It was much later, after I worked as a case writer, that I began to grasp that the faculty was pursuing a deeper agenda. The case method is not about learning how to connect the dots; it’s about learning how to solve for pattern. That second-year student’s crib sheet was no help for dealing with a supply chain problem masquerading as a marketing problem.

Elements of Solving for Pattern

Organizations are stuffed with people ready to solve problems once they’ve been labeled. What is rare—and more valued—is the ability to frame and structure problems so that they can be solved. There are no simple pictures hiding in the environment waiting to be revealed and pointing to pre-determined responses. The proliferating dots are signals in the environment worth attending to; ignoring them is unwise. If we’re not looking for dots to connect and trigger responses, what are we to do? Solving for pattern is a problem-framing and response-design process that better fits the world we occupy.

Wendell Berry introduced the term in an effort to contrast industrial farming strategies with family farming strategies. The strength of the strategies Berry advocated came not from swapping corporate managers for family ownership but from treating farms as complex systems rather than faux factories. As a trivial example, Berry points to replacing chemical fertilizers on crops with the manure produced by the milk cows feeding off those same crops. Connecting those two isolated systems into a larger dynamic system produces better outputs, reduces costs, and has other positive effects that ripple through other components of the overall farm system.
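The arithmetic of Berry’s example is worth a toy model; every number below is an invented placeholder, but the shape of the argument survives.

```python
# Toy model of Berry's fertilizer/manure example.
# All quantities are invented placeholders.
fertilizer_cost = 100   # crops as an isolated system: buy inputs
disposal_cost = 30      # cows as an isolated system: haul waste away

isolated = fertilizer_cost + disposal_cost
connected = 0           # manure feeds the crops that feed the cows

print("two isolated systems:", isolated)   # pay twice
print("one connected system:", connected)  # one subsystem's waste is the other's input
```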

Much of solving for pattern is a change of perspective. It starts by viewing the current situation as a dynamic equilibrium producing outcomes that are a mix of desirable and undesirable results. The goal is not to fix a component in a bigger machine but to push the system into a new stable equilibrium with a better balance of desirable and undesirable outcomes.

This strategy of solving for pattern melds pattern recognition, systems thinking, design skill, and change management practices to effect these rebalancing efforts. Pattern recognition and systems thinking are the tools we need to understand where we are and the trajectory we are on. That understanding feeds into the design and change management efforts needed to envision and shift from the pattern we have to the pattern we want.

In his essay, Berry was exploring the complexities of managing the miniature ecosystem of a working farm. Whether you call them problems, opportunities, or messes, the situations that arise on farms—and in organizations—never come with labels or instructions attached. Nor are they neatly isolated from whatever else may be going on. Tackling those situations effectively starts with choosing a better point of view.

The limits on connecting the dots

I was a big fan of Sherlock Holmes and various other fictional detectives growing up; I’m still drawn to the form when I want to relax. There’s a basic pleasure in trying to match wits with Sherlock and figure out who the killer is before the final reveal. Connecting the dots is a rewarding game, but it’s a flawed strategy.

The picture is always clear after the fact. But to jump from that fact to the assumption that you can recognize the dots earlier and connect them into one single, compelling, inarguable picture that triggers right action is a leap too far. Detective stories have an advantage that truth lacks; the writer knows where they intend to go.

The problem is that connecting the dots is such a well-worn storytelling and story-organizing technique. The key to using it as a storytelling technique is to manage the ratio of story dots to noise dots. Looking backward to make sense of the outcome that came to pass, the storyteller emphasizes the dots that reveal the picture and shares just enough other dots to suggest the work our hero had to do to see the hidden picture. As a good storyteller, you might add a clever twist or put our hero in just the right spot to see what others have missed.

Suppose, however, that in the real world our potential hero was looking in the wrong direction or trying to separate the relevant dot from a thousand irrelevant but similar dots. Now our hero might be portrayed as a fool or an incompetent. Our hero’s reputation and future hang on the objectives of the storyteller, not on the events in the field.

The appeal of “connecting the dots” as a rhetorical strategy obscures its flaws as a thinking strategy. There are two principal flaws from a problem-solving perspective. One, it suggests there is a single, correct picture of what is going on, waiting to be discovered as soon as someone draws the correct thread through the data points that otherwise appear to be noise. Two, it presumes that recognizing the picture offers sufficient, timely insight into appropriate action steps.

There are always too many dots; in a big-data world we seek more dots in the hope that the picture will emerge. All that the analytical tools can do, however, is add more possible pictures that might be hiding in the dots. Any collection of dots might form multiple pictures; more dots simply suggest more pictures.

Which brings us to the second problem: how does recognizing a picture connect to effective response? In our detective stories, our goal is to bring the killer to justice, perhaps to intervene before the killer strikes again.

In the world of more dots than time, our goal is to devise a response soon enough to make a difference. We wish to create a new story to be told. The dots behind us are only useful in suggesting where to place the dots that are not yet part of the picture we are in the act of creating.

Practical Action, Volatile World

I’ve lost the certificate, but I was once honored by my classmates for the “best bluff when called on unprepared” in a managerial decision analysis class during my first year in business school. I’m fuzzy about what had happened the night before, as were most of my classmates; some celebration had taken precedence over opening the day’s case, much less actually reading it. Anyway, it’s now 8:30 and Professor F opens by saying “Mr. McGee, could you please take the role of John Smith for today.” Professor F then proceeded to name two other victims and paired each of us with classmates who were lawyers. We were dispatched to the hallway while Professor F briefed the rest of the class on what was going to happen next. Out in the hall, I turned to my new partner, Jay, and asked him the key question on my mind: “Who is John Smith?” Jay’s response was “you’re the CEO of Acme—keep your mouth shut and don’t do or say anything unless I tell you to.” This was advice I was qualified to handle.

We returned to the class and played out the scenario laid out in the case I hadn’t opened. I followed Jay’s advice and we escaped unscathed until the debrief. Bruce, one of our classmates who had been at the same celebration and knew my level of non-preparation, asked Professor F how he had selected his victims for the day. His answer was, “picking the lawyers was obvious, of course; I then chose Mr. McGee since I knew he would have already cracked the case and, therefore, would follow the correct course of action.” This provoked the reaction you might expect and I was forced to confess.

The obvious lesson is to do your homework and surround yourself with people you can rely on. For many years that was all I took away, and it wasn’t bad advice. But it’s proven to be incomplete, and the second-order lessons have become more important. What’s interesting is the changing relationship between the lessons of experience and the challenges thrown up by an environment that is more volatile, uncertain, complex, and ambiguous (VUCA if you are into trendy shorthand) than what was contemplated in that classroom several decades back.

A manager’s job is to exercise what Tom Peters calls “a bias for action” and move in some direction. Analysis and experience influence the direction chosen. Movement generates new signals about how the environment is reacting, and you keep moving, possibly in a slightly different direction. I’ve written before about Donald Schon and the role of “reflective practice.” In Schon’s formulation, effective managers engage in a continuing process of formulating hypotheses, acting, and adjusting their actions based on outcomes. The value of cumulative experience lies in making better choices about the direction of movement.

In familiar territory and established markets, action choices are clear. You only need or want enough analysis to distinguish between available action choices. In new markets and developing territory, action choices are less clear and possibly opaque. Action and understanding have to coevolve. There is never enough time to prepare; the world happens at its own rate and managers must act ready or not.

The value of experience changes in this world. Experience gave managers an inventory of familiar situations and responses that worked once upon a time. In this environment, situations don’t map to those we’ve seen before. Managers can’t simply select from a menu of responses that are known to work; they must design new responses fit to new realities.

Learning is harder in the digital world

Most of us have crappy theories of learning. The better you were at school, the more likely your theories about learning are distorted. I ran into this phenomenon while I was the Chief Learning Officer at Diamond Technology Partners in the 1990s. My partners were full of well-intentioned advice about how they thought I should do my job, based on their school experiences years or decades earlier in their lives. I had my own, somewhat less ill-informed, theories based on my more recent school experience of convincing my thesis committee to let me loose on the world.

Fortunately, I also made the smart decision to seek out several people smarter than I was and hang around with them long enough to soak up some useful insight. Two in particular, Alan Kay and Roger Schank, were instrumental in shaking me free from my poor theories. Very different in temperament, they agreed on fundamental insights about how learning works.

Learning is what happens when our expectations about the world collide with experience. As we adjust our expectations to be better aligned with reality we learn.
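That collision has a simple computational analogue in the error-correcting update found everywhere from moving averages to reinforcement learning. The sketch below is my illustration of the idea, not something Kay or Schank proposed.

```python
def learn(expectation, observation, rate=0.3):
    """Move an expectation toward what was actually observed.
    The surprise (observation - expectation) drives all the change;
    no surprise, no learning."""
    return expectation + rate * (observation - expectation)

expectation = 0.0
for observed in [10, 10, 10, 2, 2]:
    expectation = learn(expectation, observed)
    print(f"saw {observed:2}, now expect {expectation:.2f}")
```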

Schools are dangerous places for learning because they are too isolated from the real world. On the other hand, the real world can be straight-up dangerous if we haven’t learned how to behave correctly in the situation at hand. All learning is learning by doing, whether we’re learning to make a turn on a snowboard or to solve a differential equation. If we had unlimited time and were invulnerable, we could figure anything out on our own. As it is, it helps to have someone who knows more than we do arrange the experiences we can learn from in a reasonable and safe sequence.

The name for this strategy is “apprenticeship,” and it remains the most effective approach from a learner-centered perspective. All other approaches are compromises to make the economics work or to solve scale mismatches between the number of those needing to learn and the number of those with mastery to pass along. Jean Lave and Etienne Wenger showed how to extend this notion of apprenticeship to all kinds of learning beyond the trade/craft connotations we attach to the word. They talked of learning and apprenticeship as a process of “legitimate peripheral participation.” You learned how to repair copiers by handing tools to the senior repair technician and carrying their bags. You learned how to handle the cash register by watching someone who already had it figured out. You learned how to put a budget together by doing the junior-level scut work of helping your boss transform a handwritten budget into a typewritten one.

It’s become a cliché that learning is now an ongoing requirement in all kinds of work. The problem isn’t simply that work demands more learning more often. The changing nature of work makes learning qualitatively harder as well. This was never a problem for physical work and for much of the knowledge work of the 20th century. Nearly everything you might want to observe was in plain sight. You could watch how a repair technician selected and handled tools. You could see an editor’s corrections and notes in the margins of your manuscript.

As work has evolved to become more abstract and more mediated by technology, the task of learning has gotten harder. Whether we call it apprenticeship or legitimate peripheral participation, it becomes difficult, if not impossible, in environments where you can’t see what others are doing. Previously, the learning called for within organizations occurred as a byproduct of doing work. It now takes conscious and deliberate effort to design work so that it is, in fact, learnable.