Don’t connect the dots; solve for pattern

Last time, we looked at the limits of a rhetorical strategy (“connecting the dots”) as a tool for planning. There are too many dots forming too many possible pictures, yet we want to take advantage of what we know has gone before to set a way forward. There is a better way, one that resides in “solving for pattern,” a phrase I first encountered in an essay by Wendell Berry.

During my first weeks of business school, marketing mystified me. I had a decent background in accounting and project management; those classes I could make sense of. In marketing, I was lost during the discussion. When the professor laid out the analysis at the end, I was no more enlightened than I had been eighty minutes earlier.

I concluded that the fatal flaw of the case method was its failure to provide neat pictures I could lay over the dots scattered throughout the case. Shortly before the midterm exam, a second-year student offered a review session with a roadmap for how to make sense of any marketing problem. I now had the picture that I could lay over the dots and connect them myself. The faculty was very distressed by this review session. We thought it was because their secrets had been revealed.

Perhaps.

I still made a mess of the midterm.

It was much later, after I worked as a case writer, that I began to grasp that the faculty was pursuing a deeper agenda. The case method is not about learning how to connect the dots; it’s about learning how to solve for pattern. That second-year student’s crib sheet was no help for dealing with a supply chain problem masquerading as a marketing problem.

Elements of Solving for Pattern

Organizations are stuffed with people ready to solve problems once they’ve been labeled. What is rare—and more valued—is the ability to frame and structure problems so that they can be solved. There are no simple pictures hiding in the environment, waiting to be revealed and pointing to pre-determined responses. The proliferating dots are signals in the environment worth attending to; ignoring them is not wise. If we’re not looking for dots to connect and trigger responses, what are we to do? Solving for pattern is a problem-framing and response-design process that better fits the world we occupy.

Wendell Berry introduced the term in an effort to contrast industrial farming strategies with family farming strategies. The strength of the strategies Berry advocated came not from swapping corporate managers for family ownership but from treating farms as complex systems rather than faux factories. As a trivial example, Berry points to replacing chemical fertilizers on crops with the manure produced by the milk cows feeding off those same crops. Connecting those two isolated systems into a larger dynamic system produces better outputs, reduces costs, and has other positive effects that ripple through other components of the overall farm system.

Much of solving for pattern is a change of perspective. It starts by viewing the current situation as a dynamic equilibrium producing outcomes that are a mix of desirable and undesirable results. The goal is not to fix a component in a bigger machine but to push the system into a new stable equilibrium with a better balance of desirable and undesirable outcomes.

This strategy of solving for pattern melds pattern recognition, systems thinking, design skill, and change management practices to effect these rebalancing efforts. Pattern recognition and systems thinking are the tools we need to understand where we are and the trajectory we are on. That understanding feeds into the design and change management efforts needed to envision and shift from the pattern we have to the pattern we want.

In his essay, Berry was exploring the complexities of managing the miniature ecosystem of a working farm. Whether you call them problems, opportunities, or messes, the situations that arise on farms, and in organizations, never come with labels or instructions attached. Nor are they neatly isolated from whatever else may be going on. Tackling those situations effectively starts with choosing a better point of view.

The limits on connecting the dots

I was a big fan of Sherlock Holmes and various other fictional detectives growing up; I’m still drawn to the form when I want to relax. There’s a basic pleasure in trying to match wits with Sherlock and figure out who the killer is before the final reveal. Connecting the dots is a rewarding game, but it’s a flawed strategy.

The picture is always clear after the fact. But to jump from that fact to the assumption that you can recognize the dots earlier and connect them into one single, compelling, inarguable picture that triggers right action is a leap too far. Detective stories have an advantage that truth lacks; the writer knows where they intend to go.

The problem is that connecting the dots is such a well-worn storytelling and story-organizing technique. The key to using it as a storytelling technique is to manage the ratio of story dots to noise dots. Looking backward to make sense of the outcome that came to pass, the storyteller emphasizes the dots that reveal the picture and shares just enough other dots to suggest the work our hero had to do to see the hidden picture. As a good storyteller, you might add a clever twist or put our hero in just the right spot to see what others have missed.

Suppose, however, that in the real world our potential hero was looking in the wrong direction or trying to separate the relevant dot from a thousand irrelevant but similar dots. Now our hero might be portrayed as a fool or an incompetent. Our hero’s reputation and future hang on the objectives of the storyteller, not on the events in the field.

The appeal of connecting the dots as a rhetorical strategy obscures its flaws as a thinking strategy. There are two principal flaws from a problem-solving perspective. One, it suggests there is a single, correct picture of what is going on, waiting to be discovered as soon as someone draws the correct thread through the data points that otherwise appear to be noise. Two, it presumes that recognizing the picture offers sufficient, timely insight into appropriate action steps.

There are always too many dots; in a big data world we seek more dots in the hope that the picture will emerge. All the analytical tools can do, however, is add more possible pictures that might be hiding in the dots. Any collection of dots might form multiple pictures; more dots simply suggest more pictures.

Which brings us to the second problem: how does recognizing a picture connect to an effective response? In our detective stories, our goal is to bring the killer to justice, perhaps to intervene before the killer strikes again.

In the world of more dots than time, our goal is to devise a response soon enough to make a difference. We wish to create a new story to be told. The dots behind us are only useful in suggesting where to place the dots that are not yet part of the picture we are in the act of creating.

Practical Action, Volatile World

I’ve lost the certificate, but I was once honored by my classmates for the “best bluff when called on unprepared” in a managerial decision analysis class during my first year in business school. I’m fuzzy about what had happened the night before, as were most of my classmates; some celebration had taken precedence over opening the day’s case, much less actually reading it. Anyway, it’s now 8:30 and Professor F opens by saying “Mr. McGee, could you please take the role of John Smith for today?” Professor F then proceeded to name two other victims and paired each of us with classmates who were lawyers. We were dispatched to the hallway while Professor F briefed the rest of the class on what was going to happen next. Out in the hall, I turned to my new partner, Jay, and asked him the key question on my mind: “Who is John Smith?” Jay’s response was “you’re the CEO of Acme—keep your mouth shut and don’t do or say anything unless I tell you to.” This was advice I was qualified to handle.

We returned to the class and played out the scenario laid out in the case I hadn’t opened. I followed Jay’s advice and we escaped unscathed until the debrief. Bruce, one of our classmates who had been at the same celebration and knew my level of non-preparation, asked Professor F how he had selected his victims for the day. His answer was, “picking the lawyers was obvious, of course; I then chose Mr. McGee since I knew he would have already cracked the case and, therefore, would follow the correct course of action.” This provoked the reaction you might expect and I was forced to confess.

The obvious lesson is to do your homework and surround yourself with people you can rely on. For many years that was all I took away, and it wasn’t bad advice. But it’s proven to be incomplete, and the second-order lessons have become more important. What’s interesting is the changing relationship between the lessons of experience and the challenges thrown up by an environment that is more volatile, uncertain, complex, and ambiguous (VUCA, if you are into trendy shorthand) than what was contemplated in that classroom several decades back.

A manager’s job is to exercise what Tom Peters calls “a bias for action” and move in some direction. Analysis and experience influence the direction chosen. Movement generates new signals about how the environment is reacting, and you keep moving, possibly in a slightly different direction. I’ve written before about Donald Schon and the role of “reflective practice.” In Schon’s formulation, effective managers engage in a continuing process of formulating hypotheses, acting, and adjusting their actions based on outcomes. The value of cumulative experience lies in making better choices about the direction of movement.

In familiar territory and established markets, action choices are clear. You only need or want enough analysis to distinguish between available action choices. In new markets and developing territory, action choices are less clear and possibly opaque. Action and understanding have to coevolve. There is never enough time to prepare; the world happens at its own rate and managers must act ready or not.

The value of experience changes in this world. Experience gave managers an inventory of familiar situations and responses that worked once upon a time. In this environment, situations don’t map to those we’ve seen before. Managers can’t simply select from a menu of responses that are known to work; they must design new responses fit to new realities.

Learning is harder in the digital world

Most of us have crappy theories of learning. The better you were at school, the more likely your theories about learning are distorted. I ran into this phenomenon while I was the Chief Learning Officer at Diamond Technology Partners in the 1990s. My partners were full of well-intentioned advice about how they thought I should do my job, based on their school experiences years or decades earlier in their lives. I had my own, somewhat less ill-informed, theories based on my more recent school experience of convincing my thesis committee to let me loose on the world.

Fortunately, I also made the smart decision to go find several people smarter than I was and hang around with them long enough to soak up some useful insight. Two in particular, Alan Kay and Roger Schank, were instrumental in shaking me free from my poor theories. Very different in temperament, they nonetheless agreed on fundamental insights about how learning works.

Learning is what happens when our expectations about the world collide with experience. As we adjust our expectations to better align with reality, we learn.

Schools are dangerous places for learning because they are too isolated from the real world. On the other hand, the real world can be straight up dangerous if we haven’t learned how to behave correctly in the situation at hand. All learning is learning by doing, whether we’re learning to turn on a snowboard or solve a differential equation. If we had unlimited time and were invulnerable, we could figure anything out on our own. As it is, it helps to have someone who knows more than we do to arrange the experiences we can learn from in a reasonable and safe sequence.

The name for this strategy is “apprenticeship,” and it remains the most effective approach from a learner-centered perspective. All other approaches are compromises to make the economics work or to solve scale mismatches between the number of those needing to learn and those with mastery to pass along. Anthropologists Jean Lave and Etienne Wenger showed how to extend this notion of apprenticeship to all kinds of learning beyond the trade/craft connotations we attach to the word. They talked of learning and apprenticeship as a process of “legitimate peripheral participation.” You learned how to repair copiers by handing tools to the senior repair technician and carrying their bags. You learned how to handle the cash register by watching someone who already had it figured out. You learned how to put a budget together by doing the junior-level scut work of helping your boss transform a handwritten budget into a typewritten one.

It’s become a cliche that learning is now an ongoing requirement in all kinds of work. The problem isn’t simply that work demands more learning, more often. The changing nature of work makes learning qualitatively harder as well. This was never a problem for physical work or for much of the knowledge work of the 20th century; nearly everything you might want to observe was in sight. You could watch how a repair technician selected and handled tools. You could see an editor’s corrections and notes in the margins of your manuscript.

As work has evolved to be more abstract and more mediated by technology, the task of learning has gotten harder. Whether we call it apprenticeship or legitimate peripheral participation it becomes difficult, if not impossible, in environments where you can’t see what others are doing. Previously, the learning called for within organizations occurred as a byproduct of doing work. It now takes conscious and deliberate effort to design work so that it is, in fact, learnable.

The danger of easy paths

Being a quick study can get in the way of learning what you need to understand. Midway through elementary school, I went on my first field trip to the Museum of Natural History in Milwaukee. The enduring memory from that trip was my first encounter with a buffet line in the museum cafeteria. My tray was loaded by the end of the line, and I was mildly ill for the remainder of the day. This was merely the first in a long line of lessons about the difference between theory and practice; lessons that I continue to trip over half a century later. There was the 4th-grade teacher who sat me in the back of the classroom and challenged me to see how much of the World Book Encyclopedia I could finish by year’s end, whenever my other work was done. By the time I got to college, I had been nudged by so many teachers and other supporters in the direction of thinking over other kinds of doing that I was on track to graduate in three years.

As that second year in college was drawing to an end, I realized that I had no idea what I wanted to do when graduation arrived. My parents didn’t flinch when I concluded that I wanted to take the full four years to finish school; I wasn’t insightful enough to grasp the financial hit I was imposing on them. Regardless, they bought me an extra year to work out an answer about a next step.

There’s a running debate over whether competence or passion should drive your career. The passion wing insists that heart should drive your choices; find your passion and the career will take care of itself. More recently, the counter-narrative that expertise precedes engagement has regained ground. This is the realm of the 10,000-hour rule and deliberate practice: better that you become good at something and trust competence to evolve into commitment.

We now have two false dichotomies—theory vs. practice and passion vs. competence. There are others we could add to the list. The world isn’t organized into these binary choices. It’s necessarily messy and complex. This is not a popular position; we all want to believe advice that begins with “all you need to do is…” Everyone is offering proven systems or guaranteed methodologies. Instead, we should accept complexity without letting it paralyze us. We need to remember what H.L. Mencken said: “there is always a well-known solution to every human problem—neat, plausible, and wrong.”

Today, in particular, we live in a world full of bright, shiny objects, each promising to address a narrow conception of the problem from its own narrow perspective. We need to become adept at framing opportunities to incorporate messiness. That’s a process that benefits from traveling companions walking the same paths.

There’s always a bigger picture

It’s a given in writing circles that there are only a small number of basic plots; the miracle of human storytelling is how many unique tales have been wrought from that basic ore.

Storytelling has become a lens for better understanding the organizations that we belong to and interact with. I think we’ve construed story too narrowly in organizations. Oddly enough, we’ve done so by ignoring the social dimensions of story.

Our first images of story conjure Homer declaiming to the audience gathered by the fire. But we quickly enlisted others to bring stories to fuller life and drama arrived. Through the usual concatenation of circumstance, fortuitous moments, and basic temperament I found my way into that world. A degree of shyness and curiosity about how things—as opposed to people—worked led to a shadow career behind the scenes.

I was fortunate that my early history involved producing original works within a tradition-rich environment. What that meant was that you started without a script or even a title, but opening night was fixed. Some of those involved had been through the experience the year before and the year before that. Others of us had no clue. All of us knew that the institution had been pulling off the equivalent trick since 1891. There are things you can accomplish before the script is finished. On the other hand, it’s hard to build a set before you’ve designed it, and you can’t design it without an inkling of what the show will be about.

From the outside, this looks like chaos. Parts of it are. Traditions and history assure you that the curtain will go up on opening night. Tradition also assures that you will lose sleep along the way. I couldn’t see it at the time but I was learning how innovation and organization worked. More specifically, I was learning about innovation as collaborative story-making and about the interplay linking technology constraints and opportunities with artistic or strategic vision.

Viewed from the audience, performance is about actors, their lines, and their interactions. Sets, lights, and costumes may be visible, may attract some fleeting attention, yet fade into the background. The massive efforts going on behind the scenes to create the context within which the performance plays out only become apparent when something malfunctions.

This creative fulcrum, where content and context are brought together, is the most interesting place to stand. Those who can learn to look in both directions—to play with both content and context—have more degrees of innovation freedom than those who limit their gaze. Learning to look both ways, however, is a very hard thing to do.

What makes it hard is that the tribes who come together to create a shared creative outcome come from very different traditions and mindsets. Pursuing excellence within any one aspect of production means settling for mediocrity in others. Working across tribes demands an ability to speak multiple languages with competence at the expense of eloquence in a single tongue.

The particular conversational bridge that draws my attention today is the contribution that changing technology can offer. Technology is a powerful creative lever. It makes possible things that weren’t possible before. In a stage production, for example, an automated lighting system can slowly change the look of the set over the course of many minutes, simulating the transition from dawn to full daylight. That is an artistic effect that wasn’t possible when the change depended on human operators.
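
To make the idea concrete, here is a minimal sketch of the kind of long, smooth crossfade an automated system can execute and a human operator at a manual board cannot. It assumes nothing about any real lighting console; the channel names, levels, and update rate are invented purely for illustration.

    # Hypothetical sketch of an automated "dawn to daylight" lighting cue.
    # Channel names, levels, and timing are illustrative only; a real console
    # exposes its own cue and fade mechanisms.
    import time

    def run_fade(start, end, duration_s, steps_per_s=10, apply=print):
        """Linearly crossfade between two lighting states over duration_s seconds.

        start, end: dicts mapping channel name -> level (0.0 to 1.0)
        apply: callback that pushes each interpolated state to the (imaginary) dimmers
        """
        total_steps = int(duration_s * steps_per_s)
        for step in range(total_steps + 1):
            t = step / total_steps  # progress from 0.0 to 1.0
            state = {ch: start[ch] + t * (end[ch] - start[ch]) for ch in start}
            apply(state)
            time.sleep(1 / steps_per_s)

    # A ten-minute sunrise: dim, cool light warming into full daylight.
    dawn = {"front_warm": 0.05, "front_cool": 0.20, "cyc_amber": 0.10}
    daylight = {"front_warm": 0.90, "front_cool": 0.60, "cyc_amber": 0.75}
    # run_fade(dawn, daylight, duration_s=600)

The point of the sketch is not the code but the patience: a machine will nudge dozens of channels by imperceptible amounts, thousands of times, for ten minutes without drifting, which is exactly the effect a human hand on a fader cannot sustain.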

But that creative effect might never be contemplated unless the artists staging and directing the performance learn about and understand what the technology has made possible. And that requires an effective conversation between artist and technologist.

It requires translators, but it also requires something more difficult. This level of creative collaboration demands a shared respect for all who contribute. Healthy, tradition-rich institutions are better prepared to cope with new innovation opportunities than weaker organizations. Resistance to change is a marker of organizational weakness; infatuation with new technology for its own sake is a marker of low trust in institutional tradition.

Managing the conversation across the performance and technology tribes depends on helping the participants see the larger picture that all are seeking to bring into existence.

Learning to think for innovation


“Innovation and change make inordinate time demands on the executive. All one can think and do in a short time is to think what one already knows and to do as one has always done.”
Peter Drucker – The Effective Executive

Once again, Peter Drucker offers insights for today even though this was originally written in 1967. Taken at face value, it suggests that innovation in today’s organizations is a fool’s quest. A few people—Cal Newport, for example, in “Deep Work”—have begun to pick up on this, but they are distinctly in the minority. They can be nearly impossible to notice or hear in the cacophony of productivity advice, inboxes littered with offers to write your best-selling book in 90 days, and workshops to design your business model in a weekend. We have always been enamored of speed and now live in an environment that raises speed to a religion. Thinking hard has rarely been popular.

There are some counterexamples. Bill Gates was famous for his “Think Weeks,” during which he withdrew from his day-to-day responsibilities to look beyond the immediate. The question for mere mortals is how to create the time and space needed to do the kind of thinking innovation requires. Recognizing the challenge is, doubtless, the first step. Few of us have the luxury or clout of a Bill Gates to dedicate entire weeks to deep thought, but we can all recognize that some kinds of thinking and reflection require bigger chunks of time and carve out those chunks where they can be found.

Another thing worth doing is to get better at stringing those chunks together in more effective ways; I’ve taken a run at this before.

Distraction is the enemy of reflection. What Drucker is pointing out is that the time innovative thought demands flows from the need to clear your mind of the immediate and to build the internal mental models necessary for deep work. I think of it as learning to meditate in a particular direction. We know a good bit about step one from the lessons of meditation, but meditation tends to stop there. The second step is to point your thinking in a particular direction and provide useful supplies for the journey. There is a process and discipline to thinking about innovation that is learnable. It is a skill that can be improved with deliberate practice.

Sweating details

During my second year in business school, I co-produced the annual student variety show. My co-producer and I ended up dipping into our own shallow wallets to cover budget overruns. To the best of my knowledge, it was the first time the show was an artistic success and a financial failure. That was a decidedly un-business school result. Although we named the show “A Bottom Line” in homage to “A Chorus Line,” we lost sight of our bottom line in pursuit of our vision.

This was probably the harshest lesson I took away from theater as a laboratory. Art and economics do not coexist without effort. Balancing them demands vision and craft. I can’t prove that pursuing one over the other leads to failure, but success always seems to correlate with making the partnership effective.

I wrestle with why this simple observation should seem so hard to grasp. Perhaps those who cling to either pole—art or economics—secretly want a ready excuse for mediocrity. How much easier it is to blame the “suits” for paying more attention to the budget than to the audience. Or to blame the director for wasting that budget on sets that no one cared about.

The tension between art and economics is no new thing. The lesson I learned over time concerned the more nuanced relationship between vision and detail. Any new product or service depends on a tapestry of interlocking details that each contribute to the overall experience. Too often, it is the interweaving that gets forgotten. As organizations strive to fit new ideas and processes into the existing design, crucial details get sanded off or lost in the gaps between jurisdictions. In last year’s Academy Awards ceremony, the presenters on stage announced the wrong winner for Best Picture. The error traced back to a momentary lapse offstage in managing the envelopes; there were two sets of envelopes because no one could be certain which side of the stage the presenters would enter from. A huge gaffe in front of the world traces back to a design detail that missed one possible failure mode.

There’s a collection of useful lessons to be gleaned from a detailed post-mortem of this particular mix-up. The broader lesson is that no one looked at how little changes in the details of a long-running process put the broad vision at risk. The lesson from my theater experience is that there is always a level of chaos playing out offstage that is essential to maintaining the illusion of control on stage. If you don’t recognize this and factor it into your designs and your management practices, the offstage chaos is going to spill into view.

Bridging the techie divide

I struggle with how to respond to smart people who preface their questions with “I’m not a techie…” In my smartass days—which I hope are largely behind me—I would have said something snide or rolled my eyes, or possibly both. Since I’ve been married to one of these people for the last 34 years, suppressing that immediate response and looking deeper is a wiser strategy.

We all get that technology suffuses our days. It is a central element of our environment. But we are not as blessed as Marshall McLuhan’s fish. McLuhan once quipped, “I don’t know who discovered water, but it wasn’t a fish.” We can’t simply swim in our technology environment and remain oblivious. Our technology environment intrudes. We see something and it feels threatening. That sense of dread limits our ability to navigate smoothly and comfortably.

For those who can see more clearly, it can be hard to parse the feelings of those who cannot see as clearly. To complicate matters further, knowledge of and comfort with the technological environment doesn’t eliminate the sense of dread—it merely relocates it.

There is a pair of questions to be explored to discover where we might go.

  • What is it about the way non-techies approach technical challenges that contributes to their distress and struggles?
  • What is it that “techies” do differently that allows them to navigate a dynamic technology environment with comfort?

I think I’ve noticed two things about how smart non-techies approach technology. First, in an odd way, they are too procedural. They seek a precise, step-by-step recipe for how to accomplish their goal; they are focused on the immediate task at hand. For them the task is what matters, and they wish to know only enough technology to execute that task. While this would seem to be a sound strategy, its weakness is that it ignores context; in particular, the technological context that exists side by side with the business context. The task at hand appears straightforward to the non-techie because they have embedded that task in its business and organizational context. Thinking that the technical task can be reduced to a single recipe assumes that any technology exists on its own, that it has no context.

This is easier to see when you notice that non-techies appear to approach each application or technology as if it were unique. It’s as though each application were a new alien arrival from another universe. By contrast, those who are technologically comfortable view each new application as one specific creature drawn from a coherent ecosystem. I sometimes think of this as seeing “behind the screens.” Those more comfortable in this ecosystem have a coherent model of what is going on behind the surface of the screens that dominate our environment.

Techies have learned to see that there are two contexts in play: one for the task at hand and one for the technology ecosystem. What makes this troubling is that acquiring these contexts takes time. You develop a sense for how individual items work within an ecosystem by building a model out of exposure to a long series of case examples. This ties into the notion of digital literacy but isn’t quite the same; that’s a line of thought to be developed over time. My hypothesis is that the path forward starts with helping the non-techie grasp that there is a second context worth being aware of.

Building two bridges between technology and practice

I got hooked on the theater in high school. The gateway was a stint as props manager for a local community theater production of “The Importance of Being Earnest.” At the same time, I was a science and math geek in class. I was immersed in the Two Cultures debate long before I even knew that it existed. As much from being shy as for any positive reason, I spent my time in the wings and backstage, where I learned about all of the technology it took to make the art happen on stage.

Because theater is live, techies and actors intermingle. There’s rivalry, but it is friendly; the need to work together is obvious. In my college theater group, the tech crew had an award called the “golden crescent wrench.” It was awarded to the cast member who did the most to support the crew. I don’t recall that there was an equivalent award from the cast to the crew, but you know how actors are.

You might think that the divide between techies and performers would be narrower in the world of organizations, where the interdependencies are just as critical to success. But that isn’t the case. Where a theatrical production has a performance to focus everyone on their joint contributions, there is no equivalent in a business. Performance is a continuing process, not a discrete event. Continuing processes push organizations in the direction of increasing specialization, organizational barriers, and silos.

Fewer and fewer people are positioned to view and understand the whole and the multiple magics required to deliver that whole. Technology is something that happens deep in the engine rooms, not nearby in the wings. There is narrow communication and certainly no fraternization between the different specialists.

This has been a long-standing problem in organizations. It becomes more pressing as technology becomes more and more visible, as it becomes more a part of the performance. Our theater analogy suggests that there are at least two problems to address. Techies must learn to appreciate performance, and performers must develop some sense for what art becomes possible with each new technological advance.

Dilbert notwithstanding, getting techies to appreciate performance may be the simpler task. A look back to theaters may help. Every theater has a history and that history includes the technological assumptions that were made at its inception. Sometimes those assumptions can be baffling; the Athenaeum Theater in Chicago, for example, has a loading dock that is fourteen feet off the ground and requires a forklift to load in sets for a show. Old theaters accrete new capabilities in layers as technology evolves and budgets ebb and flow. Wishing that a particular constraint didn’t exist is a waste of time; history sets limits on what is easy, hard, and impossible. None of those constraints matter, however, unless they interfere with a particular artistic goal. Understanding the performance goals is the central step in taking advantage of the technological possible.

Developing a meaningful grasp of the technological possible appears to be the more challenging task. Arthur C. Clarke captured this challenge in his third law: “any sufficiently advanced technology is indistinguishable from magic.” If all technology looks like magic to a performer, how can they make sensible choices among options? The answer lies in the practice of stage magicians. They start with the illusion they wish to create and a deep understanding of human psychology. They add a good helping of knowing how technology has been used in the past to create particular effects. Then they proceed to experiment with what might be possible next.

What this becomes is a very grounded approach to innovation. You start with where you are. This includes a sense for what has gone before. You imagine where you hope to take your audience/customers. Only then do you explore what can be done with new technology to fulfill your vision. This exploration does require that performers develop a sense for technology, its history, and its edges. That might require some fraternization with the technologists in the organization.