What is “capitalism”?

I’m giving myself some permission to provide a definition of capitalism as I understand it, and not the absolute best definition of the word. Books have been written about the subject, and the way capitalism works has shifted a lot since the 1700s, when its foundational treatises were written by Adam Smith and his contemporaries. So the definitions I land on are going to be imperfect, incomplete, and only relevant to the present time–a different definition may be more applicable in, say, 1900 or 2100.

What is an “economy”?

First, we need to make sure we’re on the same page about what an “economy” is. The way it’s discussed can make it sound like a vast, unknowable thing that needs to be handled gingerly for fear of ruining it.

An economy is a tool. It falls into the same broad category as a wrench, a government, a personality test, and an atomic bomb. Specifically, an economy is a tool that helps a collective (a family, a city, a country, a world) distribute the resources available to it.

Like other tools, it’s good for some things and not for others. Like other tools, it can be handled properly or improperly. And like a lot of other tools, it’s possible to improve upon its design–either to make it more useful overall, or to make it better at some specific thing you need to use it for. (For example: a hammer with a shock-absorbing grip is an overall improvement, while a rubber mallet is a specific type of hammer that specializes in doing particular tasks well.)

This definition is important because of language that’s often used around economic questions. For example, it’s often asserted that socialized medicine would lead to “rationing.” However, since medicine is part of the “economy,” it’s already “rationed”–rationed by the market, which sets higher price points for more complicated and rarer products, instead of by someone making a decision about who gets what. Setting a high price for something is a way of rationing that item to the people who have the ability to pay for it.

(One of the forces changing the economy today is that there are significant resources on the Internet that don’t need to be rationed: information obtained through searches, social media posts, even video streams to a lesser extent. Since humanity isn’t capable of consuming all of the resources that one company is capable of producing, this results in a winner-take-all scenario that was extraordinarily rare before about 1998. Google receives nearly 80% of search traffic, but you can’t split up Google’s search engine as though it’s a monopoly: after an initial period of confusion, most of those users would settle on one of the two new engines, making it the “new Google.”)

Any given economy has many ways to ration out its resources. In the United States, for example, we ration out emergency services based on people calling 911. In principle, we could let the market take care of emergency services–you’d either have to sign up for a subscription (like an alarm company) or give them a credit card number before they put out the fire in your house. But at some point we decided to pay for emergency services with tax dollars instead, and to ration out those services based on who was asking for them. Whether they’re handled by the market or by 911, though, centralized emergency services are part of how we distribute resources, and so they’re part of our economy.

Isn’t “capitalism” just a fancy word for “free market”?

“Capitalism” and “free market” are different things. “Free market” just means that anyone is able to sell a new product to compete with others that already exist in the marketplace. It’s the opposite of “monopoly,” whether that monopoly is held by a single powerful business or a government entity. A free market is a pillar of capitalism, but an economy doesn’t need to be capitalist to have free markets.

I’m not convinced there’s any such thing as a “free market economy.” As with socialism (discussed below), there are different degrees to which an economy’s markets are free. Some items may be completely unregulated, while other items may be sold by only one vendor or handled outside the market entirely.

Some products and services might be handled better by a free market, while other products and services are better handled by a regulated market or a closed market. (John Stuart Mill says in Principles of Political Economy, “Laissez-faire, in short, should be the general practice; every departure from it, unless required by some great good, is a certain evil.” This represents the attitude of later “free-market” economists, including Milton Friedman, who, unlike Mill, held to this dogma his whole life. But careful readers will notice that even Mill allows for regulation when required by “some great good”–and as capital markets have become more complex, “good” and “evil” have also taken on complexity. Mill might have changed his mind about some things if he’d seen the financial instruments that brought about the housing crisis in 2008.) You don’t have the ability to sign up for a private fire extinguishing service when you buy a home–that’s a market that isn’t free. Your house may be required to meet local building codes, but you can hire any licensed contractor to make sure it does–that’s a free but regulated market. You can also buy whatever decorations you want to put inside your home, made by anyone, provided they don’t break other laws (e.g., they aren’t filled with cocaine or made with the fur of endangered animals)–that’s a free and unregulated market.

A market completely free of regulation only remains a free market for a short period of time. Markets left to themselves tend to eat themselves (through mergers, acquisitions, or winner-take-all competitions) until only monoliths exist that are capable of either decimating or acquiring all their rivals. (Amazon, for example, has been able to acquire or destroy most of its competitors before they grew big enough to pose a challenge; however, it fails to meet the definition of a traditional monopoly because retailers like Wal-Mart still hold a non-trivial portion of the online market.) Regulations are needed to prevent monopolies from forming and to ensure fair competition and consumer transparency (for example, without regulations McDonald’s would never have to tell anyone they’re putting sawdust in their food to make it cheaper).

The idea pushed by neoliberal (free-market) economists like Friedrich Hayek is that the free market is much more effective at distributing resources than any government could ever be. Some economists, including Milton Friedman, argue that allowing the market to distribute resources also maximizes efficiency and manages to create more resources where previously there were fewer. (A lot of economists believe “efficient market theory” was debunked in the 2008 market collapse: pricing was thrown off by the irrational behaviors of actors within the market, demonstrating that market prices can be inefficient for extended periods of time.) But there are probably very few people who wouldn’t concede that at least some circumstances require us to offer resources based on need and not the ability to pay, and you won’t find many people who think we should do away with all money and assign resources only based on need either. Most economies will include some elements of free markets and some elements of central planning.

So what is “capitalism”?

Capitalism has a few basic features we’ll be looking at over the next several posts:

  1. Pricing: This is supposed to be the real advantage to capitalism. If a government entity tries to decide how much a new product might be worth, they might end up picking a number out of the blue that really doesn’t reflect everything that goes into making that product or what people could feasibly pay for it. But if a business tries to sell that product on the open market and another business tries to sell a similar product, the two competitors will keep adjusting their prices until they both hit a spot where they’re making acceptable profits or one of them has gone out of business. (This is what’s called a Nash equilibrium. It doesn’t actually maximize benefit for anyone; it just describes the inevitable price points for competing products and vendors given what one’s competition will do. There are also a lot of problems with market pricing when rational decision-making is compromised and/or when long-term impact is taken into account. We’ll get into that later.)
  2. Value: Everything, no matter how abstract or intangible, can be assigned a monetary value. This includes the value of an idea, a policy, or a human life. It suggests that everything in existence, both real and imagined, can be valued in terms of the amount of money that would be paid for it. Items that have complex value (for example, human beings) are simplified to the relevant value–a human being might be valued based on her accounting skills when she’s being hired at an accounting firm, but would be valued based on her public speaking skills if she’s being hired as a diplomat, or valued based on the cost of a wrongful death settlement when a car company is thinking about recalling its airbags, or valued by the total of all the potential money she would have made over the rest of her life if an economist is trying to determine the economic impact of her early death. It’s a simultaneously practical and horrifying way of thinking about the world.
  3. Means of production: In a capitalist economy, the government doesn’t own assets such as farms, factories, workshops, recording studios, and oil refineries–the means of producing the products we buy. The “means of production,” as they are called, are owned by private citizens, either individually or through corporations, trusts, funds, cooperatives, and similar structures. (The line between “government” and “corporation” is much murkier than politicians make it out to be. It’s not inappropriate to think of the market as a de facto fourth branch of American government: whoever holds the means of production is able to set rules and determine important aspects of the lives of citizens. In this light, it may deserve the same respect as, say, the judiciary, but it should also be treated as limited.)

    The means of production are the source of all wealth. Labor isn’t important to capitalism–it represents a necessary expense in order to extract the value from assets, and should be minimized as much as possible. Capital–or the means of production–is what’s important. Labor is paid with wages that get spent on products. Capital yields profits that are invested into more capital. Labor is always depleting resources while capital is always accumulating them.

    The basic idea put forward by Adam Smith back in the 1700s is that someone controls the capital–a needle factory, for example–and hires other people to work for him. These employees are able to produce a lot more than they would if they were working alone because they’re able to specialize in the tasks they’re most competent at performing–for example, pulling the wire, cutting the wire, shaping the head, sharpening the tip. Where one person could make, say, 200 needles per day doing every task on her own, four people could make 2,000. In Adam Smith’s ideal world, the workers get more money than they would on their own because they’ve produced more needles, and the factory owner would harvest the excess value–for example, if each employee gets 300 needles’ worth of wages, the factory owner would still get 800 needles’ worth of profits.

    (Adam Smith was idealistic in his illustration. The more likely economic reality of this scenario is that the employees get the lowest wages the factory owner can offer them, because the factory owner is able to price his needles lower than the needles produced by individual craftspeople. That is, the four people who were selling their own needles at 50 cents apiece now have to compete with a factory selling needles for 40 cents apiece, so they either lower their prices to 40 cents and earn only $80 a day, or work for the factory for $85 a day, where previously they were able to sell their own needles for $100 a day. The irony is that the factory owner’s profits are lower than they would be if he continued to sell needles for 50 cents and paid the employees $110 a day, but because the free market is competitive, that would leave a lot of room for a competitor to come in and sell needles at lower prices. In any of these scenarios, however, the factory owner comes out better than all of the workers, which is the point.)
  4. Capital Accumulation: Not only does capital produce profits, it also accumulates wealth. Most of the things money can buy in the world, from cars to computers to carrots, are depreciating assets. They decay over time or are used up, resulting in a destruction of wealth. But there are some assets, such as land, factories, and brand names, that can become more valuable over time. People who own these things get wealthier without having to work (appreciation), while people who work and buy things will have to work and buy more things because their things keep getting used up (depreciation). (It wouldn’t be accurate to say that owners of capital don’t do any work, but it is accurate to say that the work they do isn’t what makes them wealthy–the capital is. Many wealthy people hire an investment advisor to do the work for them and can continue to accumulate wealth while doing no work at all.) Most people have a little bit of both appreciation and depreciation: you might have a retirement account that holds stocks that appreciate over time, but your clothes depreciate as they wear out and you have to buy new ones. The greater the proportion of appreciating assets you have, the more your wealth accumulates. When valuing a human being, then, there are two tiers: people who are valued based on the work they can do, and people who are valued based on the capital they control. A truck driver for McLane Company is valued based on her labor; Warren Buffett, the CEO of McLane’s parent company Berkshire Hathaway, is valued based on his assets.
  5. Capital Markets: Once separate pieces of capital are pulled together into a mode of ownership, such as a corporation, you can break up that ownership into pieces and sell them to other people. And because everything can be assigned a value, and because selling it on a market in a competitive environment results in an actual price, you can do this with anything. (That’s what got the United States economy into trouble in 2008: lenders taking a bunch of mortgages, packaging them together and selling them, then taking pieces of those packages, packaging them together, and selling them, then hedging some of those against other packages… etc.)
  6. Fluidity: The free market isn’t just for products and capital. Unlike under feudalism, people can easily move between different employers under capitalism, even without having to pick up and move to another part of the world. The flip side of this is the degradation of community: since people need to be able to move around easily, your bonds with your neighbors are weaker. Everyone is responsible for her own fate, and many more of our interactions are transactional: instead of confiding in your local priest or preacher, or talking things through with your neighbors, you might pay a therapist or life coach to listen to your problems. (This is what sociologist Ferdinand Tönnies called “Gesellschaft,” a broad and anonymous society, in opposition to “Gemeinschaft,” a specific and familiar community.)
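The arithmetic in the needle-factory example above is easy to check directly. Here’s a short sketch that does so; all the prices and quantities are the illustrative numbers from this post, not figures from Adam Smith:

```python
# Checking the needle-factory arithmetic from the example above.
# All prices and quantities are illustrative numbers from the post.

WORKERS = 4
NEEDLES_ALONE = 200        # needles per person per day, working solo
NEEDLES_FACTORY = 2000     # total per day for four specialized workers

# Scenario A: four independent craftspeople selling at 50 cents apiece.
alone_income = NEEDLES_ALONE * 0.50                # dollars per person per day

# Scenario B: factory sells at 40 cents and pays $85/day in wages.
profit_b = NEEDLES_FACTORY * 0.40 - WORKERS * 85   # $800 revenue - $340 wages

# Scenario C: factory keeps the 50-cent price and pays $110/day in wages.
profit_c = NEEDLES_FACTORY * 0.50 - WORKERS * 110  # $1,000 revenue - $440 wages

print(alone_income, profit_b, profit_c)  # 100.0 460.0 560.0
```

Scenario C is more profitable on paper, but the higher price leaves room for a competitor to undercut it–which is why the owner prices at 40 cents, and why the owner comes out ahead of the workers in every scenario.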

As I’ve described in a previous post, capitalism effectively decoupled and abstracted the political and economic realities that existed under feudalism. Where feudalism required wealthy people to own land, capitalism enabled wealthy people to maintain and grow their wealth independent of land. (This was an important transition at the end of the Malthusian era, when the total wealth available to a population essentially came down to its ability to produce food. Once we invented better ways to produce food, wealth became dissociated from land.) This also means violence takes on a different form under capitalism, and warfare and destruction are less direct.

Why capitalism?

Capitalism is underpinned by a belief in the effectiveness of rational (or enlightened) self-interest: that people make the best decisions for their own needs with the information they have available to them. (That people consistently act against their own rational self-interest does not dissuade believers in capitalism. Usually there is a rationalizing belief: that irrational actors get edged out of the market, that poor people are either lazy or need to educate themselves, that the “self-interest” part can be irrational but a person will still act rationally to get it, and so on.) The result is the “invisible hand” of the marketplace moving to fulfill the needs people are ready to pay for: if all of a sudden everybody wants to eat radishes, the price of radishes will go up, and more people will grow radishes because they’ll get a better price for them. (In later posts we’ll look at how this breaks down, especially in the age of the Internet.)
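The radish story can be sketched as a tiny price-adjustment loop (economists call this a tâtonnement process). The demand and supply curves below are invented for illustration; nothing here models a real market:

```python
# A toy "invisible hand": the price climbs while demand exceeds supply and
# falls while supply exceeds demand. The linear curves are made up for
# illustration only.

def demand(price):
    # Buyers want fewer radishes as the price rises.
    return max(0.0, 100 - 10 * price)

def supply(price):
    # Growers plant more radishes as the price rises.
    return max(0.0, 20 * price)

price = 1.0
for _ in range(500):
    excess_demand = demand(price) - supply(price)
    price += 0.001 * excess_demand  # small adjustment each round

# The curves cross where 100 - 10p = 20p, i.e. p = 10/3.
print(round(price, 2))  # 3.33
```

If “everybody wants to eat radishes” shifts the demand curve up, rerunning the loop settles on a higher price, drawing in more growers–the mechanism the invisible-hand story describes.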

Capitalism was originally developed in opposition to the idea of a central regime–usually a monarchy–attempting to make the best determinations about how resources should be used. Under feudalism, a monarchy might grant a certain portion of land to a knight, who would be the caretaker of that land and all the commerce that was conducted on it. As opposed to the visible hand of a ruler distributing resources under a feudal economy, capitalism depended on an “invisible hand” distributing resources based on how much people would pay for a product or service.

It’s helpful to see capitalism as the product of its particular time: the Age of Enlightenment, when philosophers and lawmakers alike saw education and rationality as a way of elevating humanity out of conflict, poverty, and all our other problems. If most men became educated and rational, the thinking was, they would be able to lead less rational men and women toward a collective social ideal.

(To put capitalism into perspective next to similar ideas in physics: capitalism is like Newtonian physics. We still use Newtonian physics for some purposes, but imagine if technologies refused to take relativity or quantum mechanics into account–we wouldn’t have inventions like GPS or even modern computers. It’s also worth noting that Enlightenment rationality has a long and problematic history with the concept of race, in which white Europeans were thought to carry the “white man’s burden” of rationality into the rest of the world and elevate all other races of humanity beyond what they would be capable of achieving without the salvation of white rationalism and scientific progress–a belief that has influenced modern politics among both conservatives and liberals, long after it became unfashionable to be explicitly and unapologetically racist. Enlightenment rationality has a similarly problematic history with sexism: for a long time only men were believed to be capable of rationality, and even today women are frequently dismissed as emotional or irrational. On the other hand, feminists such as Mary Wollstonecraft made appeals to rationality as a way of bolstering their arguments–though it’s been argued that rationalism was simply the language of the time and not a prerequisite for the success of feminism.)

Capitalism was successful in its time due to the invention of new technologies and methods of organizing. Today, the invention of new technologies and methods of organizing are among the reasons we need to develop new economic models and methods that leave capitalism behind.

What about socialism?

There’s a lot of discussion about socialism in the political arena these days. To some people it’s a scary word. Others embrace it whole-heartedly. But what is it?

Socialism is a version of capitalism where the government owns a stake in some industries. Depending on how you define “industry,” most economies are socialist to greater or lesser degrees. For example, in the United States, the government owns most of the military industry and the education industry, as well as a significant stake in the amusement and tourism industry in the form of parks, museums, and monuments, and a pretty big piece of the transportation industry in terms of roads, bridges, waterways, and airports. This isn’t to the same degree as the United Kingdom, where the government owns some television and radio broadcasting channels and the lion’s share of the medical industry, but the United States government owns a large enough portion of certain industries that it can already be called socialist. So the discussion isn’t about whether to be socialist, but about which industries benefit from government involvement and to what degree.

To be clear, I intend to look at issues with capitalism itself, which means the solution may not be socialist in nature. It might require a re-thinking of the assumptions of capitalism, which would necessarily challenge socialism as well.

What about communism?

In my view, communism is also a modification of capitalism. The means of production are held in common by the people, but otherwise it’s intended to work more or less the same way.

In its purest form, communism requires the means of production to be owned by a democratic government and made available to the people based on who will make the best use of it. For example, if one person wants a piece of land to build a farm and another person wants to build a factory on it, there would be a rational, unbiased way of determining whether the factory is needed, whether it’s a good location for a factory, whether the soil and rain conditions are good for farming, etc., in order to decide which person should use that piece of land. In practice, centralized decision-making is the sort of outrageous idea that could only be conceived in an early Industrial Age mindset, when it seemed like rationality would take over the world and everything would become quantifiable.

In the capitalist ideal, some profits flow to workers by way of increased wages, while the rest flow to the owners by way of capital accumulation (whether or not that actually works is another question). In the communist model, the state takes the place of the owner. It’s a shortcut to the dark capitalist utopia envisioned by Arthur Jensen in the movie Network: “one vast and ecumenical holding company, for whom all men will work to serve a common profit, in which all men will hold a share of stock, all necessities provided, all anxieties tranquilized, all boredom amused.”

The real flaw in communism is not that it kills worker motivation but that a centralized decision-making process about who gets to use assets is extremely prone to corruption and to the short-sightedness of the central decision-makers. (This is true even if the process is, in principle, controlled by all the people. In principle, the United States government executes the will of the people, and yet only 17% of people approve of Congress as of the latest Gallup poll–a number that hasn’t been above 50 percent since 2003. It’s hard to imagine people would be happier with Congress if it were directly in charge of all the jobs, resources, and products in the country.) Put the no-nonsense Jack Welch in charge and you might stifle innovation for a decade; put the visionary Steve Jobs in charge and you’ll get a tremendous amount of innovation that fits his particular vision, but nothing outside of it. Even the perfect candidate would be limited by human decision-making, and let’s be honest when assessing our human tendencies toward rationalizing corruption.

What’s wrong with capitalism?

It’s been my conviction for the past several years that we are seeing a shift that heralds the end of capitalism as a ruling ideology in our world. This isn’t because we’re getting smarter–it’s because capitalism, and its variations including socialism and communism, fail to address our economic realities.

Some people think this is a bold claim, some people think it’s ridiculous, some people might even think it’s anti-American. But to me it’s about as obvious as pointing out that winter is coming: it’s getting colder, the leaves are falling, we all know where that leads. And if capitalism is ending, we’d better get ready to replace it with something.

I don’t have the solution for what replaces capitalism. But this is a conversation I’m ready to have with other humans who want to see a world in which ordinary people can produce meaning and improve our collective quality of life.

What follows will be a series of posts pointing out what I see as some of the key strengths and weaknesses of capitalism, so that we can begin to talk about how we might do better in the future.

If I left anything important out of my definition of capitalism here, please feel free to comment below. Keep the comments on this post to a discussion of the definition here–there are going to be a lot of posts in this series and there will be plenty of time for criticism and theorizing. It will be helpful to have a solid working definition to start.

Talk to you soon, fellow humans.

References

References
1 For example: a hammer with a shock-absorbing grip is an overall improvement, while a rubber mallet is a specific type of hammer that specializes in doing particular tasks well.
2 One of the forces changing the economy today is that there are significant resources on the Internet that don’t need to be rationed, including information obtained through searches, social media posts, even video streams to a lesser extent. Since humanity isn’t capable of consuming all of the resources that one company is capable of producing, this results in a winner-take-all scenario that was extraordinarily rare before about 1998. Google receives nearly 80% of search traffic, but you can’t split up Google’s search engine as though it’s a monopoly because after an initial period of confusion, most of those users would settle on one of the two new engines, making it the “new Google.”
3 John Stuart Mill says in Principles of Political Economy, “Laissez-faire, in short, should be the general practice; every departure from it, unless required by some great good, is certain evil.” This represents the attitude of later “free-market” economists, including Milton Friedman (who, unlike Mill, held to this dogma his whole life). But careful readers will notice that even Mill allows for regulation when required by “some great good”–and as capital markets have become more complex, “good” and “evil” have also taken on complexity. Mill might have changed his mind about some things if he’d seen the financial instruments that brought about the housing crisis in 2008.
4 Through mergers, acquisitions, or winner-take-all competitions.
5 Amazon is an example of a company that has been able to acquire or destroy most of its competitors before they came to be big enough to pose a challenge, however, it fails to meet the definition of a traditional monopoly because retailers like Wal-Mart still hold a non-trivial portion of the online market.
6 A lot of economists believe “efficient market theory” was debunked in the 2008 market collapse. Market pricing itself was thrown off by the irrational behaviors of the actors within the market, proving that market prices could be inefficient for extended periods of time.
7 This is what’s called a Nash Equilibrium. It doesn’t actually maximize benefit for anyone, it just describes the inevitable price points for competing products and vendors given what one’s competition will do.
8 The line between “government” and “corporation” is much murkier than politicians make it out to be. It’s not inappropriate to think of the market as a de facto fourth branch of American government. Whoever holds the means of production is able to set rules and determine important aspects of the lives of citizens. In this light, it may deserve the same respect as, say, the judiciary, but it should also be treated as limited.
9 Adam Smith was idealistic in his illustration. The more likely economic reality of this scenario is that the employees get the lowest wages that the factory owner can offer them, because the factory owner is able to price his needles lower than the needles produced by individual craftspeople. That is, the four people who were selling their own needles at 50 cents apiece now have to compete with a factory selling needles for 40 cents apiece, so they will have to lower their prices to 40 cents and get only $80 a day, or work for the factory for $85 a day, where previously they were able to sell their own needles for $100 a day. The irony is that the factory owner’s profits are lower than they would be if he continued to sell needles for 50 cents and paid the employees $110 a day, but because the free market is competitive, this would leave a lot of room for a competitor to come in and sell needles at lower prices. In any of these scenarios, however, the factory owner comes out better than all of the workers, which is the point.
10 It wouldn’t be accurate to say that they don’t do any work, but it is accurate to say that the work they do isn’t what makes them wealthy. The capital is what makes them wealthy. Many wealthy people hire an investment advisor to do the work for them and can continue to accumulate wealth while doing no work at all.
11 This is what sociologist Ferdinand Tönnies called “Gesellschaft,” which represents a broad and anonymous society, in opposition to “Gemeinschaft,” which represents a specific and familiar community.
12 This was an important transition with the ending of the Malthusian era–where the total wealth available to a population essentially came down to the ability to produce food. Once we invented better ways to produce food, wealth became disassociated from land.
13 That people consistently act against their own rational self-interest does not dissuade believers in capitalism. Usually there is a rationalizing belief that irrational actors get edged out of the market, that poor people are either lazy or need to educate themselves, that the “self-interest” part can be irrational but a person will still act rationally to get it, etc.
14 To put capitalism into perspective next to similar ideas in physics, capitalism is like Newtonian physics. We still use Newtonian physics for some purposes, but imagine if technologies refused to take relativity or quantum mechanics into account–we wouldn’t have inventions like GPS or even modern computers.) the Age of Enlightenment, when philosophers and lawmakers alike saw education and rationality as a way of elevating humanity((Enlightenment rationality has a long and problematic history with the concept of race, in which white Europeans were thought to carry the “white man’s burden” of rationality into the rest of the world and elevate all other races of humanity beyond what they would be capable of achieving without the salvation of white rationalism and scientific progress. This belief has influenced modern politics among both conservatives and liberals, long after it became unfashionable to be explicitly and unapologetically racist.
15 Enlightenment rationality also has a long and problematic history with sexism, in which for a long time only men were believed to be capable of rationality, and even today women are frequently dismissed as emotional or irrational. On the other hand, feminists such as Mary Wollstonecraft made appeals to rationality as a way of bolstering their arguments. Even so, it’s been argued that rationalism was the language of the time and not a prerequisite for the success of feminism.
16 Even if it is, in principle, controlled by all the people. In principle, the United States government executes the will of the people, and yet somehow only 17 percent of people approve of Congress as of the latest Gallup poll. That number hasn’t been above 50 percent since 2003. It’s hard to imagine people would be happier with Congress if it were directly in charge of all the jobs, resources, and products in the country.

There Are No “Good People”

A lot of people are surprised when they discover that I don’t believe in “bad people.” I don’t believe there is such a thing as an irredeemable, fundamentally broken individual who just needs to exit the human race as quickly as possible.

“Not even Hitler?” the hypothetical objector exclaims, appealing to Godwin’s Law right out of the gate.

“No, hypothetical person,” I reply. “Not even Hitler.”

I’m raising this point in the midst of sexual assault scandals rocking everyone’s world–as if we should be surprised that a culture which scarcely thirty years ago didn’t widely recognize sexual harassment, and which to this day asks victims of rape what they were wearing and whether they should have gone into the room with him, conditions its men to respect their own sexual urges over the self-sovereignty and safety of others.

“But I’m a good man,” cries Louis C.K., Bill Clinton, George H.W. Bush, Al Franken, George Takei, or whatever respected man is currently under discussion as having forced himself sexually on others.

Well that right there is your problem. The flip side of the notion that “bad people” don’t exist is that “good people” don’t exist either. There are just “people,” with all the mess of bias, emotions, desires, and other irrationalities.

I don’t mean to excuse any of the horrible things done by these or any other people. But whenever I give an apology with the claim, “I’m a good person”–or anytime I defend someone saying, “He’s a good person”–I’m implying there are “bad people” out there who are the ones who do these things, and the bad thing I did isn’t part of who I am. But clearly it is part of who I am. Because I’m the person who did it.

Of course, there are also people who think they’re the “bad people.” These people go home and love their spouses, children, or pets with complete selflessness. They give to poor people or help others avoid the mistakes they themselves made, often with the reasoning that “just because I’m a bad person doesn’t mean everybody else has to suffer.”

In a way, both these narratives exist because they save us energy. If I’m a “good person,” I don’t have to stop and think about what I’m doing, because by virtue of “being good,” I won’t ever do anything bad on purpose. If I’m a “bad person,” I don’t have to stop and think about what I’m doing either, because even if I try to do something good it will inevitably be corrupted by my “bad” nature.

The most terrible people in the world have almost always been “good people” by their own reckonings. Tyrants, slave traders, and genocidal maniacs have all reasoned that because they were essentially “good,” the actions they were taking must be justified.

It’s this kind of “goodness” that prevents us from making progress against racism, sexism, classism, and all the other dysfunctional “-isms” that plague our culture and keep crushing human lives under their weight. Your mom spouts vitriol about the Vietnamese family who moved in next door, but she’s a good person. Your buddy touches women inappropriately all the time but hey, he’s a good guy. Your boss would rather vacation in ever more remote tropical islands than lift a finger to help people less fortunate, but he’s always nice to you at work, so he’s a good person too.

Do you consider yourself a “good person?” If so, I recommend seeking treatment immediately before the condition worsens. Talk to a therapist or religious leader, and if they in any way imply it’s a simple thing to do, get a second, third, or fourth opinion as needed. Read Thich Nhat Hanh or Thomas Merton, follow the fantastic On Being podcast and blog, look in whatever texts you consider sacred for the words that are spoken to you and not the words that are spoken to others.

Give up being a “good person” or a “bad person” and work on becoming “good at being a person”–someone who has learned to accept his irrationalities and idiosyncrasies and limitations, who always acts with empathy, who considers the people affected by his actions before taking action. To quote Kendrick Lamar, “Be humble.”

I struggle to this day with the belief that I’m a good person. Sometimes I have to catch myself when I think that the things I believe or the lifestyle I embody mean that I’m a good person, incapable of doing wrong because it’s simply not in my nature. There are also times when I’ve been shaken to my core to think that I might not be a good person–that I’m not capable of doing anything right, that I’m useless as a human being. It took me years of growth and practice to recognize and ingrain in myself that I was neither good nor bad. And as I began to leave behind rightness and wrongness (to allude to the Islamic mystic Rumi’s famous poem), I also began to find I was calmer, more focused, more energized by the change I could help to create in the world and less burdened by self-doubt.

This isn’t a quick process–it means dedicating yourself to learning how to be human the way you might dedicate yourself to learning guitar or glass blowing; and it means you have to keep practicing instead of depending on your inherent “goodness.” But it’s the one skill literally everyone needs. It’s the one skill that matters most to our collective future. And you can’t be an effective leader of your home, your business, or your country without it.

If you’re looking for help with this, please post in the comments below and I’ll try to provide some more resources.

Emerging and Disrupting With Purpose

The most disruptive idea in the market right now isn’t a new technology. It’s organizations that can disrupt themselves.

In my last post (which was some time ago), I talked about collective intentionality at the end of a series of posts about emergence. Before I move on, I want to bring the two ideas together.

Emergence is often discussed in scientific contexts as something which doesn’t have purpose on an individual level–only the collective appears to have purpose, as with slime mold finding the shortest path to food despite each individual cell having no such specific intention.

The interesting thing about intentionality is that it doesn’t require conscious thought–as a matter of fact, in its best form intentionality is close to unconscious. Intentionality is directed existence, or “being about something.” In philosophy, “intentionality” is typically discussed in the philosophy of language: the word “table,” for example, is “about” a table. The word isn’t a table, but signifying a table is the word’s reason for being. If tables didn’t exist (even as a concept), “table” wouldn’t be a word–it would just be a jumble of letters or sounds.

Similarly, when we choose to be intentional, what we are choosing is to be “about” something on a fundamental level. It happens at a level more basic even than a typical mission statement. This “being about” is something Simon Sinek describes in his “Golden Circle” approach: the “why” toward which all action in an organization is directed. It’s true that there isn’t an intelligence directing the movements of slime mold or the flocking of birds, but there are many individual parts combining a few simple rules with a collective objective: to find food, to find warmth, to survive and reproduce. Without that intentionality, the movement of slime mold or the flocking of birds would never happen: the birds would fly off in their own directions and the mold would grow aimlessly until it died.
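The flocking case can be made concrete. Below is a minimal sketch (my own illustration, not anything from the original post) of Craig Reynolds’ classic “boids” rules: each bird follows three simple local rules–cohesion, alignment, and separation–with no leader and no global plan, yet the group’s headings converge into coordinated motion. All the specific numbers here are arbitrary demo parameters.

```python
import math
import random

def step(positions, velocities, neighbor_radius=10.0,
         cohesion=0.01, alignment=0.1, separation=0.05):
    """One synchronous update of Reynolds-style flocking rules in 2D."""
    new_vels = []
    for i, (px, py) in enumerate(positions):
        vx, vy = velocities[i]
        nbrs = [j for j, (qx, qy) in enumerate(positions)
                if j != i and math.hypot(qx - px, qy - py) < neighbor_radius]
        if nbrs:
            # Rule 1 - cohesion: steer toward the local center of mass.
            cx = sum(positions[j][0] for j in nbrs) / len(nbrs)
            cy = sum(positions[j][1] for j in nbrs) / len(nbrs)
            vx += (cx - px) * cohesion
            vy += (cy - py) * cohesion
            # Rule 2 - alignment: drift toward neighbors' average velocity.
            ax = sum(velocities[j][0] for j in nbrs) / len(nbrs)
            ay = sum(velocities[j][1] for j in nbrs) / len(nbrs)
            vx += (ax - vx) * alignment
            vy += (ay - vy) * alignment
            # Rule 3 - separation: back away from neighbors that crowd in.
            for j in nbrs:
                dx, dy = px - positions[j][0], py - positions[j][1]
                d = math.hypot(dx, dy)
                if 0 < d < 2.0:
                    vx += dx / d * separation
                    vy += dy / d * separation
        new_vels.append((vx, vy))
    new_pos = [(p[0] + v[0], p[1] + v[1]) for p, v in zip(positions, new_vels)]
    return new_pos, new_vels

# Demo: birds that start with scattered headings end up moving together.
random.seed(0)
pos = [(random.uniform(0, 5), random.uniform(0, 5)) for _ in range(15)]
vel = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(15)]

def heading_spread(vs):
    mx = sum(v[0] for v in vs) / len(vs)
    my = sum(v[1] for v in vs) / len(vs)
    return sum((v[0] - mx) ** 2 + (v[1] - my) ** 2 for v in vs)

before = heading_spread(vel)
for _ in range(100):
    pos, vel = step(pos, vel)
after = heading_spread(vel)
print(f"velocity spread: {before:.3f} -> {after:.3f}")
```

No single bird “decides” to flock; the collective behavior emerges from the shared rules and the shared objective, which is the point of the analogy.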

As humans, our intentions can be much more varied, but they still need to be fundamental. An organization, for example Gravity Payments, could have an internal manifesto with guiding principles, objectives, goals, key performance indicators, and so on, but all of these are worthless if they don’t draw clear circles highlighting the central “why” of the organization: to simplify transaction processing. Everything CEO Dan Price says to the members of the organization must reinforce its central narrative and focus every individual’s actions toward achieving that purpose. Only when everyone in the organization is moving toward the same purpose does emergence propel the whole organization.

By establishing intentionality and changing the structure of an organization to better facilitate emergence, the organization will be prepared to increasingly disrupt itself. This doesn’t happen automatically. There are other factors to consider, particularly the diversity of perspective, the responsiveness to external realities such as customers and market conditions, the potential for peaceful revolution within the organization, and so on. These factors can affect the viability of an organization whether it’s a garage-based startup or an entire nation-state.

What all this means is that traditional organizations have it backward: Strategy will take care of itself, if you take care of the people. The decisions made by the so-called executive level will bubble up from what were previously considered the lowest levels of the organization. This requires re-thinking the organization’s relationships to some pretty fundamental principles, including power, employment, and compensation.

I’m eager to get readers’ thoughts about this approach to adaptive organization. What possibilities of this approach excite you? In what ways are you skeptical about this approach? What about the idea requires more clarification?

Be What You Intend to Be

Much of what goes on in a traditional organization is unintentional. That is to say, it isn’t an action that someone has decided to take in order to contribute to the well-being of that organization and its stakeholders. It’s operating on default.

Ironically, unintentional behavior can often be the result of trying to clamp down on unintentional behavior. On the other hand, it can just as easily be the result of leaving people isolated and expecting them to do their best work without any assistance or support.

The road to a more intentional organization is one described ideologically by business greats from Warren Buffett to Richard Branson. Here is the idea as verbalized by Steve Jobs:

It doesn’t make sense to hire smart people and then tell them what to do; we hire smart people so they can tell us what to do.

Taken to its logical conclusion, this idea is counter to the operation of a traditional organization. Traditionally, decisions get made and orders pushed down the chain of command; results come back up and get pieced into something like the final result that the person at the top of the chain wanted.

Counter-intuitively, the result of the traditional approach is that much of what happens in the organization is unintentional. People who wait for orders don’t make the best use of their own time; and the people above them, who don’t have the perspective of each individual’s point of view, don’t make the best use of their time either. People fulfill their immediate expectations without a view of what’s good for the whole. What’s more, managers often don’t communicate all their expectations, and the results reflect the holes in each subordinate’s understanding of the tasks assigned to him.

Becoming intentional means, at least in part, understanding myself, acknowledging and accepting what I am, and developing upon my strengths. As in the Cherokee proverb of the two wolves, I become better by feeding what is good within me. It’s not a choice I make when I’m faced with a hard question, it’s a choice I make by the way I condition myself to face the hundreds of little choices throughout the day.

The same is true of an organization: I have to feed what is best in my organization and what is best in the individuals within it.

This is one reason organizations that focus on facilitation can be much more effective than traditional organizations. Instead of “managing” in the traditional sense, leaders help people to do and become their best, guiding their individual work toward the ultimate good of the organization as a whole and helping to connect it to the work of others.

What this means for a leader is that I am first of all responsible to my people rather than for them. (Responsibility for my people is still important, though it’s mostly externally-facing: followers want leaders to have their backs.)

Whereas a traditional organization is merely, as Emerson put it, “the lengthened shadow of one man,” an organization of facilitation is an attempt to leverage the power of community toward a common goal. That makes the intent of each individual important to the whole. Each level is intentional about its own goals and behaviors, and each subsequent level is there to help the previous level attain its goals and bind efforts together.

Here are a few risk factors for unintended behavior, and what you can do about them:

  • Fear. When people are afraid of something, they tend to either destroy it or hide it. I have never seen either of these behaviors yield positive results in an organization. If the people working with you act fearfully, address it head-on. Learn what they are afraid of. Dig into the root cause, too–few people are afraid of disappointing a customer so much as they’re afraid of what might happen to them. If you start to notice a lot of people having similar problems, you have a systemic fear on your hands–usually one that has to do with trust within the organization–that requires a change.
  • Inconsistent culture. People are more willing to take personal risks if they feel anchored and supported. That has partly to do with knowing that the people around them have their back–even people who may be on a different team, or come from a very different background. Your hiring practices and cultural guidelines need to be spelled out so that the people you hire are people you’d choose to weather a crisis, not just people who would have fun together at happy hour. More than that, everyone in your organization needs to be telling the same story and believe in the same destiny.
  • Too much process. Process can be a good thing if done correctly–if the process represents a best practice, serves the people, and is capable of evolving. But if you need a process to mitigate risk, that means you already have unintended behaviors–and adding a process could make the issue worse, as people attempt to short-cut or circumvent the process in order to get their work done. (Ask yourself: Is the process an invention or a control?) Pare down or eliminate any processes that get in the way of doing good work, and instead focus on gaining buy-in from your ostensibly reliable (you did hire them, right?) employees as to how to avoid putting your community at unnecessary risk.
  • Over-management. If responsibility for my efforts always goes up to my manager, my natural human response is to fight against that control mechanism. I might give up on doing anything that isn’t assigned to me, I might deliberately procrastinate or slack off, or I might start looking for other jobs. (The top cause of burnout isn’t over-working, it’s lacking control over or engagement with your work.)1 Adding controls and oversight to prevent me from doing anything but the work I’m supposed to be doing will provoke a desire to rebel against them. Try cutting out levels of management and finding ways to prevent micromanagement, or better yet, train your “hierarchy” to be a facilitating structure instead of a managing structure. If you have good people, you won’t need to control them; and if you stop controlling them, you’ll find out pretty quickly who’s good and who isn’t.

The only way you’re going to get more than a handful of people to be fully engaged in accomplishing a goal is to get them to buy into that goal and work toward it on their own motivation. In other words, hire good people and let them tell you what to do. Think of it this way: As long as I hold the power to fire my leader, what do I lose by being a servant?

What reservations do you have about making this kind of change? Did I miss something? I’m looking forward to getting your reactions in the comments.

References
1  A quote from a study in the Indian Journal of Industrial Relations: “Burnout can be minimized/avoided if individuals develop a high level of involvement in their jobs and they are able to identify themselves psychologically with their jobs.”

Nourish the Unexpected: Facilitating Emergence

Simply giving up control isn’t quite enough to get the people in your organization doing self-managed, unprecedented work. Facilitating their work is also critically important.

Facilitation nourishes and encourages people in several ways. It feeds the part of us that wants independence and mastery because a more experienced manager/co-worker is helping us with a goal instead of exercising command over it. It feeds the part of us that wants social validation: if someone is helping us accomplish a goal, it tells us the goal is worth accomplishing. It even feeds the part of us that’s lazy–that is, the part that wants to accomplish our goals while using the smallest amount of energy possible.

Think of your organization as a computer:

photo credit: Algorithmic Contaminations via photopin (license)

A computer is highly structured, functional, and hierarchical, but in order to continue running the latest software, it has to be continually upgraded and redesigned. A computer doesn’t grow on its own. This is the traditional organizational model.

Now think of your organization as a garden, growing all kinds of plants:

photo credit: Gardens at Canons Ashby via photopin (license)

You can select the kinds of plants to grow; you can fertilize and water them to help them grow faster (but not too much, or you’ll choke them); you can trellis them to help them grow in a certain way; and you can prune them when they grow in ways that aren’t fruitful. Plants in a garden grow on their own, but left untended, weeds will sprout, diseases will take hold, and some plants won’t receive enough nutrients.

(Doesn’t this second metaphor sound like your organization already? Why do we so often feel like we need the additional layer of inorganic structure, except that we want an illusion of control that we don’t actually have?)

Facilitation is the art of pruning, trellising, weeding, hedging, fertilizing, and helping your organization grow. You don’t order a pear tree to blossom, you don’t command bees to pollinate, you don’t provide tomatoes with minimum production quotas. You also don’t give them these initiatives and then go back inside your house and expect everything to work unless you’re told otherwise.

A similar approach can be used to grow your organization.

Consider an example of a great gardener: Brian Grazer, movie and television producer and co-founder of Imagine Entertainment. Grazer’s preference to ask questions and make requests rather than give orders helps gain buy-in, makes people feel respected, and allows him room to doubt his knowledge without being hands-off. As a leader, he uses questions and requests as a form of trellising, guiding people to grow in a certain direction rather than commanding them to do so.

The kind of gardener you become is up to the specifics of your situation. So long as you’re seeking to grow your people and your organization, you will treat them with care and make sure they have the resources, support, and guidance they need to grow in the way that’s best for them. Being neglectful and being overattentive both have their hazards.

Have you ever worked with a good “gardener?” What have you learned from these people who dedicate themselves to growing their people, their organizations, and even their strategies?

Hiring for a Unique Culture

Culture is an emergent phenomenon. It exists between the people who make up that particular culture, and evolves based on their interactions–the mythology, folk knowledge, and traditional practices they create and pass between themselves. If you hire based on skill alone, your internal culture will look pretty much like the rest of your industry, because it will be populated with the same kinds of people.

Unlike in the Industrial Age, hiring today isn’t a matter of picking up a part and fitting it into an already-designed machine to make the machine run. Hiring into an emergent environment only works when the candidate fits both the current culture and the future culture. Emergent strategy depends on the people within the organization working with and off of one another to yield unplanned results.

Here are a few tweaks to your hiring practices that may yield better results:

  1. Don’t appeal to everybody. Many organizations just want to be liked by everyone. They want to be the place where any individual out there would love to work. Don’t do that. Your organization is unique, and you want people who fit that collective vision and identity. Netflix asserts very clearly that its culture isn’t for everyone, but that is precisely what makes its culture all the more appealing to those who do fit. Figure out now why people wouldn’t want to work in your organization, and you’re on the way to creating a unique and powerful culture.
  2. Fill blind spots, not roles. Roles are a collection of responsibilities and skills that fit a pretty standard definition. Blind spots require a more complete understanding of your team and organization. Simply put, a blind spot is something you need that you don’t have, defined as broadly as possible (e.g., do you really need someone with three years of Trello experience, or do you just need someone who’s comfortable with agile project management?). A blind spot may be a specific competency, like familiarity with a particular piece of technology, or it may be a tweak to the chemistry of the current team–for example, a more outgoing individual who will facilitate communication between the more introverted members of a remote team. Hiring for blind spots strips away the expectations that come with hiring someone into a particular role, allowing the new hire to integrate more organically with what’s already going on in your team for the first few months until they find a rhythm.
  3. Advertise your vision, not your requirements. Anyone who isn’t excited by your specific vision doesn’t belong to your culture. And don’t just advertise the vision of your company. If possible, state succinctly but with enthusiasm what your vision is for the team and even for the specific role. A less-skilled candidate who is energized by the collective vision will be twice as valuable as a more skilled candidate who just wants a new job. And bear in mind that a long list of qualifications belies a search for an interchangeable part. If you want your candidates to get excited about a position, pare it down to your vision and the key blind spots you’re trying to fill. Leave room for the candidate to surprise you.
  4. Interview thoroughly. The hiring processes I’ve seen average two interviews. Google suggests no more than five–and then actually goes on to interview candidates five times, looking for factors including raw skill, problem-solving ability, and cultural fit. In an adaptive organization, you’re going to want to take advantage of four or five interviews in order to thoroughly vet the skills, the personality, and the chemistry with the current environment.
  5. Weigh potential. Today the pace of change in technology and the economy means being able to learn what’s needed for the future is more important than having what’s needed in the present. Your people will not only need to adapt as things change, but they will need to create change themselves. And then they will need to live into that change. If the candidate doesn’t have what’s needed to adapt to whatever his role will be in three years, he may not be the best fit.
  6. When in doubt, leave them out. Don’t hire a candidate unless they leave you no other choice–by which I mean, she is such an excellent fit for your organization that you couldn’t bear to let her take another job. Turn away business before hiring someone who doesn’t add to your culture. Adaptive organizations thrive based on the number and quality of connections between employees. Hiring someone who isn’t going to improve your internal network is poison to your long-term goals.
  7. Enlist your recruit’s help. Zappos offers a $2,000 bonus for new hires to quit. The idea is that a new hire will take the money if they don’t feel that they are a good fit for the culture or they don’t believe in the long-term potential of working with the company. In the long run, the occasional $2,000 quitting bonus saves the company a lot of money on people that might otherwise be a drag on the culture. A new-hire quitting bonus might not work for you, but you should still look for ways you can work with a new hire to ensure he’s the right person, and part ways amicably if he’s not.
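
The economics behind a quitting bonus like Zappos’ can be sketched as a back-of-the-envelope expected-cost comparison. The bad-fit cost and self-selection rate below are hypothetical numbers I’ve chosen purely for illustration, not figures from Zappos or the original post.

```python
# Hypothetical assumptions: a poor-fit hire eventually costs $25,000 in
# ramp-up, turnover, and culture drag, and 10% of new hires are poor fits
# who would take a $2,000 bonus to quit early (assumes they self-select).
def expected_cost_per_hire(p_bad_fit, bad_fit_cost, bonus=2000.0):
    """Expected cost per hire without and with a new-hire quitting bonus."""
    without_bonus = p_bad_fit * bad_fit_cost
    # With the bonus, a bad fit leaves for the bonus amount instead of
    # imposing the full bad-fit cost on the organization.
    with_bonus = p_bad_fit * bonus
    return without_bonus, with_bonus

without, with_ = expected_cost_per_hire(p_bad_fit=0.10, bad_fit_cost=25_000)
print(f"Expected cost without bonus: ${without:,.0f} per hire")
print(f"Expected cost with bonus:    ${with_:,.0f} per hire")
```

Under these assumptions the occasional payout is cheap insurance; the real numbers will vary, and the model ignores bad fits who stay anyway, but the direction of the trade-off is the point.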

Filling your organization with effective people who fit with the people around them and are excited about a common vision is the basis of any good culture, not just in an adaptive organization. But because of the importance of emergence in adaptive organizations, getting the mix of people right for your culture is a crucial requirement for success.

EDIT: Reader Brian Gorman offers two additional points to consider: “Having spent more than four decades living in the world of organizational change, I would add two more to his list. 1. Hire for the culture that you want, not the culture that you have. 2. Hire for resilience; you need people who can learn new skills, and shift their mindsets, as your organization continues to change.” I would add a caveat to the first that anyone you hire needs to be able to work in the culture you have today, or she’ll be out the door as soon as she can–which makes finding adaptable people all the more important during a period of change.

Which of these points do you find is most important or illuminating? Are there any important points about hiring for culture that I’ve missed? Do you disagree with my points? I look forward to discussing it with you in the comments.

Listen to the Opinion, Speak to the Experience Part 2

“For acquired knowledge cannot be divorced from the existence in which it is acquired.” – Dietrich Bonhoeffer

It’s been pointed out to me that my previous post is a bit confusing. Granted, it’s a topic that’s probably worth several books, and a skill that can take years of personal development to build. But I want to drill down to a core that’s useful even in the short term.

There is no such thing as objectivity among humans. (As software people are fond of saying, “It’s not a bug, it’s a feature.”) In collaboration with one another, we represent a wide array of experiences and we have each filtered out what we have found to be the salient points that we apply as broad rules of the world. This is a cognitive belief, or what we call an “opinion.”

But behind the opinion is the semi-instinctual gut feeling that is our initial filter. This is an emotional belief: a reaction, derived from our experience, that we first feel and then attempt to understand through logic and words. (I say “semi-instinctual” because highly developed, balanced individuals can actually inform and change their emotional beliefs.)

So, when we are dealing with people–whether it’s working toward consensus at a meeting, motivating a co-worker, or addressing a client’s concerns–we are dealing with a complex of emotional beliefs, masquerading as opinions.

Particularly in business, we’ve been taught to act as though the world is a rational place–or at least, that it can be made rational. And so when we encounter conflicts in opinions, we take all the facts and information from those opinions and try to reconcile them. When we can’t, we start throwing out those that don’t agree with our views until we come up with a patchwork of ideas that meshes together. Or worse, we split the difference between competing opinions and call it “compromise” just to get people on board.

The message of this process is that not every experience is valuable. If I’ve contributed my opinion and it’s been thrown out, it means that I am wrong and my perspective is useless (according to whoever is throwing it out).

But there are reasons behind every opinion that are relevant to the solution. If I have a difference of opinion from everyone else in the room, it means I have an important experience to contribute–even if my opinion, the product of that experience, doesn’t bear out in reality.

So much of our focus in management (and even leadership) is on getting the facts, the efforts, the opinions to fit together into a whole. And so we may often end up with solutions that are like an exquisite corpse: a too-elaborate tacking-together of mismatched parts that could never be functional.

What if, instead of trying to mesh together a patchwork of opinions, we instead undercut the opinions and worked to form an understanding of the human experience underlying the problem? What if there were no relevant experiences that didn’t matter? What if an opinion, which we often use as a way of rationalizing our emotional beliefs, is actually a lens we can look through to find the experiences that are most important to what we’re doing? Could we find a way to address the whole reality of our human experience of a problem, instead of presuming that our years of experience or our level of mastery elevate us toward perfection?

I’m not sure of the answer, but I do know that developing my own emotional maturity and my own ability to see through the eyes of others is one of the skills I value most in my business experience. This post is my own opinion: the way that I make sense of my experience. I look forward to being informed by yours.

Listen to the Opinion, Speak to the Experience

We each have at least two sets of beliefs: cognitive beliefs and emotional beliefs. Which one do you believe controls you?

You’re likely to say your cognitive beliefs–because it’s your cognitive self that is analyzing the question, and that part of yourself wants to believe it is dominant. That it has the power to bully your emotional self into agreeing with it.

But if we were all governed by our rational selves, we would look at the same facts, see the same things, and form the same opinions. There would be no public debate, and we certainly wouldn’t have the incessant raving of rabid pundits on every form of media.

My emotional beliefs determine which facts are more important than others, which virtues are more significant than others, which vices are more destructive than others. They are the substance of all my conflicts with my lover, my mother, my best friend, my boss.

But my cognitive self wants to believe it’s in control. And so it formulates cognitive beliefs–what we call “opinions.” These opinions form a shield around our emotional beliefs, which is why we hold onto opinions so dearly. To expose our emotional beliefs would leave them open to invalidation.

To measure and count and address the opinions of people is to be a representative, not a leader. A leader isn’t concerned with opinions, she is concerned with experiences.

Consider the myriad experiences in the debates over immigration: legal immigrants with illegal-immigrant friends and family who risked their lives to cross the border; legal immigrants who struggled through a complex system; immigrants whose legal status is threatened or has slipped; union workers put out of work by immigrants; refugees from physical and economic violence; citizens who live close to violent border towns; illegal parents of legal children; kids who grew up with immigrant parents or grandparents. Every one of these people (and more) has his own experience that informs his opinions about immigration.

Phenomenology, the study of experiences, sits at the intersection of philosophy and anthropology. It's a field that has gained some recognition lately through books such as The Moment of Clarity, which describes case studies using anthropological techniques to inform business decisions at companies like LEGO and Intel. It also helps to turn this inquiry inward: to observe not just the experiences of customers but the experiences of the people within my own organization.

When I shout an opinion at you, what I’m saying is, “This is the best way I can see to reconcile my own experience with what I know about the world.” If you attempt to address my opinion, you are saying, “You just don’t know enough about the world.” When you attempt to address my experience, however, you are asking, “How can your experience inform what we know about the problem?” Doing so not only moves a team toward consensus, but promises a better solution.

Of course, it’s not wise to ask, “What experience do you think is driving your opinion?” Nobody wants to turn a business meeting into a therapy session. Instead, try to live like an anthropologist among those you would seek to lead. Watch how they work and observe their environment. Hear the patterns of their complaints and identify their core beliefs. Consider their incentives and responsibilities. Try to become one of them (without taking it overboard and acting like you can do what they do). Always, always ask, “Why?”

Over time, and with practice, you will start to hear the experiences. And as you do, it will become possible to address problems in a real, substantial way, rather than simply speaking to the opinions.

If you agree or disagree, please share your own experience in the comments so that we can all learn from it.

Do This, Not That: Market Versus Social Norms

Dan Ariely makes a distinction between market norms and social norms in the fourth chapter of Predictably Irrational. He touches briefly on the way that employers mix their messages, dangerously breaking social contracts and making things about money when they are attempting to lead a socially driven organization.

As the book documents, operating on market norms (i.e., thinking about the money I'm getting in return for the activity I'm doing) can damage productivity even when compensation is considered adequate. Worst of all, it can damage relationships when we thought we were operating on higher terms: social norms like trust, reciprocity, and friendship. And we can't mix the two: once we perceive that our efforts are being valued according to market norms, that's the mindset we use for the entire interaction.

The next era of commerce will not be kind to organizations that depend on market norms, except perhaps as a back-end, business-to-business protocol. For the most part, those things that are driven by competition, price, and data can be outsourced to computers and become a secondary function of people-facing businesses, businesses that use humans for those things humans are uniquely capable of accomplishing.

If you’re still using market norms to run your business, it’s best to start weeding them out now, before they relieve you of all your self-motivated people and leave you with half-hearted key-punchers.

Here are a few “do this, not that” guidelines for common business practices:

  1. Pay healthy salaries, don’t track hours. Some businesses require hour tracking, but to the extent that it’s possible, your people shouldn’t identify the time they put in with dollar amounts. Doing so puts them in a market mindset: Am I getting enough money to be worth what I’m doing? Paying healthy salaries instead removes market questions from their minds, and has the potential to make the rare transformation of money into a social contract: the business is a community that takes care of your needs, rather than an employer compensating you for your activity. This is the genius behind Netflix’s policy of paying employees as much as it would take to keep them: there’s no need for employees to ever negotiate salary or think about how much their work is worth, so they operate on a basis of trust and social contract rather than constantly competing with the employer for a fair wage. Even better if employees have direct deposit, where the money simply appears in their accounts as if by magic.
  2. Appeal to social contract, don’t talk about money. It should go without saying that you should never bring up the fact that you’re paying an employee, or use money as a bargaining chip for a change in behavior. They’re already aware that a threat to their position in the community is a threat to their livelihood. Focus on the social contract rather than the monetary transaction. Are they letting down their co-workers? Are they hurting their ability to make a difference in the organization? Talk about those things. If you have to mention money, it’s already a lost cause. (If they’re the ones bringing money into it, you might as well address their concerns–they’re already thinking in market terms. Take it as a form of feedback on your ability to keep market norms out of your business, and consider whether the issues raised might affect other people as well.)
  3. Make your people financially secure, don’t cut costs at their expense. If your employees have to be worried about paying the rent, covering bills, and eating, then they are already thinking about their jobs in terms of market norms. If you’re going to employ someone, make sure you’re ready to pay enough that they don’t have to be concerned about the basics of life. That includes health care, child care, and retirement. Ariely and James Heyman report that people who perceived themselves as paid inadequately lost as much as a third of their productivity at a very simple mechanical task (forget creative problem solving), and that’s without factoring in any worries about feeding their children. And if Costco is any indication, paying a living wage is a clear path to sustainable business.
  4. Share successes, don’t pay bonuses. This is a tricky one: Traditionally, bonuses are the way you share successes. But paying bonuses can create a clear line between the actions of an employee and the money, turning the action into a market-regulated one rather than a socially regulated one. There are different ways of accomplishing essentially the same thing. One is to reframe the concept of compensation entirely, as with my post on taxation. If employees interpret the amount they earn not as a payment from you but as something they are accomplishing with you, it may be possible to avoid activating market norms. Another way is to award the bonus as an in-kind gift–but this is fraught with pitfalls. Having the employee choose the gift causes the employee to think about the monetary value; choosing the gift for the employee puts one in danger of choosing something the employee doesn’t want or need; and having co-workers choose may invite comparison and market-norm thinking among the co-workers.
  5. Show loyalty, don’t dig moats. There are already a lot of financial obstacles to leaving a job. Creating new ones causes your people to think about the job in terms of their financial need instead of thinking about the social contract. Instead, you should make it as easy as possible for them to leave–and challenge yourself to convince them they shouldn’t. To the extent your people feel that they are with you by choice and not by necessity, they will be more likely to act on social norms instead of market norms.

It can be difficult to manage the financial needs of the business while operating on social norms, but undermining the social norms can quickly undo all the effort you’ve placed into creating them. If you start by thinking of your organization as a community, a family, or a nation, you will be on more solid ground. And when in doubt, leave the money out of it.

Motivating by Story: A Lesson from Theater

When you go to see a play, you may sit down with some assumptions about the people you see on stage. Among them may be an assumption that all those people want to be there.

A stage or a film set isn’t substantively different from the environment on a work team–the egos may be a little bigger, but if you’ve spent any time leading people, you’ve probably seen some big egos already.

One of the difficulties you might not expect lies in getting the cast on the same page. A director’s ability to realize an interpretation of a script (which is analogous to meeting the requirements of a project) depends on each actor individually working with that interpretation.

The irony is that for everyone to be on the same page, everyone has to be on a different page. Having a big-picture understanding of Hamlet finding out about and then exacting revenge for the murder of his father is a great perspective to have, but it’s not necessarily the story each actor is playing. Each person in the cast has to have a unique story in order to effectively and convincingly fulfill his duties.

To Laertes, Hamlet is a story about revenge against the man who murdered his father and drove his sister to insanity and suicide. To Claudius, it’s a story about the ambition of a man who wouldn’t accept the cards dealt to him and worked to improve his station.

Similarly, one of the most important motivators in a team is not to see the big picture, but to see the small one–to see my story. It’s too easy to clock in and check out when I see what I’m doing as serving someone else’s story, and I’m just selling a bit of my life for a paycheck. To live fully within the work I’m doing, I have to understand: what am I doing right now to live my values?

Do you know the stories your people are telling? Do you know not only their value, but their values? Every level of organization has separate values: a nation, a business, a team, an individual. If you know their values, it will be easy to help them see their own stories in the larger story you’re telling together.

“Sure,” you might say, “I can show a key player how his individual story weaves into the big picture. But there are some people–interns, new hires, people from other departments or other companies–who tend to see their roles as small and interchangeable. And they are,” you may even say, “because they don’t have the specialization or depth of involvement to create unique value.”

There are two roles in Hamlet that are so famously insignificant and interchangeable that Tom Stoppard wrote a play about how insignificant and interchangeable they are: Hamlet’s friends, Rosencrantz and Guildenstern.

These two characters have no distinguishing features and serve only to move the plot forward. The actors could easily clock in and check out. But a good director doesn’t ignore even the smallest part, particularly if she already has star players locking down the major roles. The flavor of the end product can be disproportionately affected by these seemingly insignificant roles.

Ultimately it comes down to the same objective: Even these small, interchangeable players have to see what they do as the integral, project-defining work that it is. Neglecting even the least significant roles confuses and degrades the end product.

Could you have a successful product without this attention? Of course–it happens all the time. I’ve seen successful projects in which much more significant roles were poorly executed. But attention to the least of your team members, and the ability to integrate their stories seamlessly into the final product, is what separates a passable director from a great leader. Learning the individual stories of each of your players, and weaving each of them into the whole, can take a project that is unlikely to even be finished and turn it into an incredible success.