The CEO's Dilemma: Balancing Profit and Purpose
There is a great scene at the beginning of the very first episode of HBO’s Silicon Valley series that perfectly captures a certain cynical hypocrisy of the startup ecosystem.
The setting is an outdoor launch party for some random company. The high-budget stage setup and beautifully landscaped garden are humorously contrasted with the sparse attendance of a crowd presumably made up of other startup founders, mostly there to network for their own ventures and drink for free. The crowd is barely paying attention as a founder (the host, we assume) ascends the stage to announce their new product. The founder states, with a straight face and not a hint of irony, that their company is “making the world a better place through…” What follows is mostly incomprehensible technical jargon about code reusability.
Like much of the content in the show, it’s funny and cringe-worthy because it tracks a certain reality in the startup world. I’ve attended parties like that a few times and heard many equally vacuous claims of changing the world for the better through one incremental technological breakthrough or another.
The episode aired in April of 2014, nearly a decade ago. While the world has changed a lot since then, the tendency of startup founders and CEOs to make such grandiose claims about improving our world through their technology without either irony or shame stubbornly persists.
And yet, some people are making the world a better place, in very real and practical ways. Some of those people, a minority to be sure, are actually startup CEOs and their companies. But they aren’t doing it through launching this or that techno marvel. They’re doing it through their values, principles, and leadership, in the way that they design and build their organizations, and with careful attention to the needs of their employees, their customers, and the communities in which they operate.
And it’s a good thing, too, as the voices of the latest generation are rising to a deafening crescendo, demanding real, material changes in how things are run and who gets to benefit from any prosperity that comes from running them. This latest generation, entering the workforce or already embedded in it, will increasingly exert an influence not only on trends and tastes but also on public policy.
Don’t Believe The Hype
The average age of a CEO, including at large corporations, is said to be a bit over 50. Looking at startups alone, the average is only slightly lower, at about 45. So we’re talking about Generation X here. Generalizations about social attitudes based on when people were born tend to be rather gross and imprecise. Still, since the members of these cohorts came of age around the same time and were exposed to the same general media and cultural background noise, such groupings can give us some insight into how a given person will generally perceive the world around them, and what conclusions they may draw from those perceptions.
Consider that millennials are in their mid-30s now and have already been in the workforce for ten years. Some are beginning to be promoted into management. They started working in the 2010s and were only teenagers during the Great Recession. They didn’t lose their houses or struggle to find a job during that tumultuous period. They were in college or high school, so they certainly heard about it, but the actual material effects were probably somewhat remote.
The average CEO, by contrast, started watching television as a child during the Cold War, perhaps even during the Vietnam conflict. As they were starting to date and learning to drive, the regulatory landscape was shifting from the New Deal of the 1930s and Johnson’s Great Society of the 1960s toward the deindustrialization and offshore manufacturing of the 1980s, with ever greater amounts of privatization and deregulation.
By the time they were entering the workforce, the Berlin Wall had fallen, the USSR had collapsed, and the cold rational logic of Wall St was announced by all media outlets as universally triumphant. Margaret Thatcher’s famous statement “there is no alternative” had finally come true. Or so it was presumed.
One very real impact of the changes in the regulatory environment in banking and finance during the 1980s and early 1990s was a major speculative bubble in Asia. Compared with the dot-com boom and the housing boom that followed, the Asian financial crisis seems like a quaint, distant memory. But at the time it was a big deal to those who were paying attention.
I happened to be paying attention, as a student of political economy, recently graduated and wandering the streets of San Francisco trying to find myself. The news in the Fall of 1997 was exhilarating for any student of economics.
The Asian financial crisis began in July 1997, when Thailand abandoned the peg supporting its currency, the baht. The baht rapidly depreciated, investor confidence plummeted, and contagion quickly spread across the region. Currencies in countries such as Indonesia, Malaysia, and South Korea also fell drastically.
In response, the US government provided emergency funding through the International Monetary Fund (IMF) to several countries affected by the crisis. The fallout also helped to fuel the dot-com boom in the United States: as investors sought safer havens for their money, capital flowed into the US economy, driving a surge in stock prices and setting the stage for the boom to come.
Driven by the proliferation of the internet and fueled by a surge of investment capital, tech startups and established companies alike experienced unprecedented growth and stock price appreciation. However, as investors became increasingly skeptical of the high valuations and unprofitable business models of many tech companies, stock prices plummeted and many startups went bankrupt.
Our average CEO was a younger worker during those years. While the Asian crisis may have been only vaguely followed in the newspapers, the dot-com boom and bust was formative. It was a struggle for most of us to get through the years from 2001 to 2003. But by 2005 something very interesting was happening. Real estate seemed to be going bananas, and the entire global economy was propelled by a flurry of buying, flipping, renovating, and investing in houses and other property.
“No One Saw It Coming”
The Great Recession was different in the way that it was covered in the media, and in the ways that the collapse reflected public opinions about the relationship between finance and technology. The bizarre period we are currently living through has its roots in that crisis, so it is best we take a brief close look.
The housing bubble had been fueled by a combination of lax lending standards, low interest rates, and speculation, leading to a rapid increase in home prices and a surge in subprime mortgages (you may recall that subprime is bank-speak for “very risky”). Subprime was popular because the mortgage industry had already run out of credit-worthy borrowers, and so it was a way for lenders to continue lending and, of course, charging fees. Wall St banks’ pressure to continue growing at all costs drove the creation of ever more bizarre and dangerous financial derivatives on top of this speculative bubble.
The Federal Reserve raised interest rates gradually between 2004 and 2006 in an effort to slow the growth of the US economy and keep inflation under control. However, as rates rose, subprime borrowers found themselves unable to afford their mortgage payments, leading to a wave of defaults and foreclosures.
The collapse of the housing market had a ripple effect throughout the US economy, as banks and other financial institutions that had invested heavily in mortgage-backed securities faced massive losses. This led to a liquidity crisis and a freeze in credit markets, as banks became reluctant to lend to one another or to other businesses and individuals. The collapse of the US housing market triggered a wider economic downturn, as businesses and consumers reduced spending and investors pulled back on investment. What followed was the longest and deepest recession since the Great Depression. Our average CEO would likely have been deeply shaped by those events.
What struck me the most about the coverage of the aftermath was the repeated refrain in the mainstream media through 2007 and 2008 that “nobody saw this coming.” There appeared to be near universal shock and disbelief that the housing bubble had collapsed or even that it could collapse at all. It is sometimes difficult to discern whether politicians are repeating the media or the other way around. But either way, the notion that this bubble’s pop caught policy makers and central bankers completely by surprise because no one could see it coming was both alarming and bizarre.
Mostly because it’s not true.
There were actually a great many voices warning throughout 2005 and 2006 that a collapse was imminent. The Economist ran a piece in 2005 with a subtitle reading, “the worldwide rise in house prices is the biggest bubble in history. Prepare for the economic pain when it pops.” In an op-ed in August of 2006, NYT columnist Paul Krugman bluntly laid out the prospect of a major and prolonged recession and expressed dismay that most economists and pundits were in denial about it. And Doug Henwood, writing in the century-and-a-half-old political magazine The Nation, warned that as the housing bubble deflated, a major financial collapse could result. And those are just a few of the many warnings being published from 2005 to 2007.
So where did this notion that “nobody saw it coming” actually come from? And why should CEOs be concerned about it?
The answer has to do with the notion of popular wisdom, and how public perception of events shapes the direction of markets and the economy. To be an effective leader, one needs to be able to read the tea leaves, look around corners, and see the wood for the trees, without being taken in by waves of overly optimistic or pessimistic group-think.
“Nothing Has Changed”
There are two common myths in the business world (and that therefore extend to startups) that need to be challenged by CEOs. Any chief executive who is able to see these two paradigms clearly, and either push back on them or ignore them, sets themselves apart from the crowd as a modern, critical-thinking leader.
The first is the myth of eternal truths. By eternal truths what I mean is that certain principles, practices, or dynamics in human domains (the economy, society, culture, etc.) are treated as if they are a natural, permanent feature of material existence. Concepts like money, markets, intellectual property, corporate law, and others are treated by the business community as if, like gravity or entropy, they simply exist in nature without having been created by humans.
This fallacy is often perpetuated through comparisons between ancient times and the present day. Take for instance the claim sometimes made that joint ventures in fifteenth-century Genoa are akin to modern-day venture capital investments. While comparisons across different eras can be illuminating, they are not without risk. By assuming that attitudes and norms from the past can be easily transposed onto our current reality, we risk oversimplifying and distorting our understanding of history. This is particularly true when it comes to human and organizational forms, which have undergone significant transformations over time.
Unfortunately, this myth of eternal truth continues to be invoked by those who seek to maintain the status quo. By appealing to the supposed ahistorical permanence of a particular concept or practice, they argue that any attempts to change things are doomed to fail. Such arguments can be used to resist reforms in a range of areas, from social justice to economic policy.
For example, advocates of laissez-faire capitalism might invoke the myth to argue that any attempts to regulate markets are misguided, since markets have always been self-regulating. In reality, however, human and organizational forms are far from static. Rather, they are constantly evolving in response to changing social, economic, and political conditions.
“Everything Is Different Now”
The second is the illusion of teleology. Teleology is a philosophical concept that refers to the idea that there is a purpose or goal inherent in the natural world or in human behavior. It suggests that things exist or happen for a particular end or purpose, and that this purpose is an inherent feature of the thing itself. In other words, teleology assumes that there is a pre-determined goal or end-point of history towards which things are naturally directed.
According to this view, any attempts to resist this forward momentum are not only futile, but also morally suspect. After all, who would want to stand in the way of progress?
However, the idea of the march of progress is deeply flawed. For one thing, it assumes that progress is a linear and unidirectional process, when in reality it is much more complex and contingent. Moreover, it often serves as a cover for the imposition of a particular agenda or ideology, by presenting it as the inevitable outcome of historical forces.
For example, advocates of technological innovation might claim that concerns about the social and environmental impacts of new technologies are misplaced, since progress always entails some level of disruption and upheaval. We have certainly seen plenty of that argument in recent years.
By invoking the idea of the march of progress, these arguments seek to foreclose debate and marginalize dissenting voices. They suggest that there is only one right way to move forward, and that any objections or criticisms are mere foot-dragging. This is a dangerous attitude, as it can lead to a lack of accountability and a disregard for the diversity of perspectives and experiences that make up our society.
While the idea of eternal truths of business and that of the inevitable march of progress are on the surface contradictory, or even opposites, they are often used interchangeably. Those in power find no contradiction there, and deploy either myth with equal destructive force depending on which ideology is likely to support their goals at the time.
Both myths were deployed interchangeably throughout the multi-year bull market run leading up to the collapse of Lehman Brothers in 2008, after the housing bubble had passed its high-water mark. Together they enabled everyone to participate in the feverish wave of speculation by believing as hard as they could that the good times would last forever, and to brush aside the uncomfortable questions being posed by party poopers like Krugman, Henwood, and the Economist.
Then the position that “nobody saw it coming” was used as cover or excuse by those same bull market cheerleaders who gained from both the boom and the collapse (remember “too big to fail”?). Now, as we watch the current economic landscape for signs of riches or ruin, the temptation to believe one or both of these myths is increasingly strong. That’s a temptation to be avoided at all costs. Things are always more complex than they first appear once you look beyond the headlines.
“The Only Way”
One way the myth of eternal truth interferes with the work of actually making the world a better place is that it is applied to corporate structure and practices of management and organization. I’ve written extensively elsewhere that today’s management culture is a product of the industrial models of organization that emerged during the early to mid twentieth century. Those mindsets are certainly entrenched in boardrooms around the world, but by no means permanent or eternal.
For starters, there is a pervasive belief in our culture that hierarchical organization is the only efficient way to organize the production of goods and services. Some take it even further, proposing that hierarchical thinking is natural and that any attempt to organize groups of people in more equitable and decentralized structures, while laudable if perhaps quaint and naive, will only lead to ruin, chaos, and disappointment for everyone.
In the face of innovations like Frederic Laloux’s Teal organizations or Brian J. Robertson’s Holacracy, they wave their hands vaguely toward the past, insisting that we’d never have had the Great Pyramids of Giza or the Roman roads and aqueducts if people hadn’t been organized hierarchically. But there is little hard scientific evidence for this view. In fact, there is a growing body of archeological research suggesting quite the opposite about humans and our capacity to self-organize.
In the book "Dawn of Everything," anthropologist David Graeber and archeologist David Wengrow argue that the research points to the existence of diverse and complex social structures. Many sites around the globe have been found dating from neolithic to bronze age periods, some that supported tens of thousands of people for hundreds of years, with no evidence of a monarch or any other kind of ruling class. The book’s exhaustive descriptions of research digs and analysis in sites in neolithic Britain, Mesopotamia, pre-colonial North and Meso-America, and periods in ancient Egypt, force us to rethink our assumptions about the prevalence of hierarchy in ancient times.
It is possible to build structures of human organization that are vast and complex in a variety of different configurations. Be careful about which “rules” you take for granted. Those tend to come back and bite you.
The myth of teleology will sometimes fool us into believing that this current way of organizing our world is the result of a coherent set of choices and tradeoffs that will naturally lead to the best possible outcomes. It’s important to appreciate that our current economic system is very new, and was neither founded nor designed. It emerged slowly out of a particular set of historical circumstances, often through periods of brutal and bloody conflict, until it became dominant only relatively recently.
But there is no reason to believe that this economic system is the only one that could possibly work (setting aside whether or not we believe it really works very well at all), or that it will be the way we do things in the future. Again, those who benefit the most from the way things are have an interest in convincing the rest that it’s the way things must be.
How Did We Get Here?
All of which raises the question: why do our modern organizations look the way they do? Is this the only way to do things? Can we do better, and if so, what would that look like?
Despite the increasing evidence that people organized sophisticated, large-scale, non-hierarchical societies throughout our ancient past, at some point around 2000 BCE the monarchical state became the dominant form of organization on the planet. And from that turning point until about 150 years ago, the way the vast majority of people on the planet lived, worked, and took care of themselves and their families remained relatively unchanged.
Among the most rigidly hierarchical of all were the European kingdoms of the late medieval period. Feudalism was the social and economic system that dominated medieval Europe. At its core was a hierarchical structure where powerful lords, known as vassals, held lands granted to them by their monarch in exchange for loyalty, military service, and other duties. Vassals would then divide their land into smaller parcels and grant them to lesser lords, who in turn pledged loyalty and service.
At the bottom of the social ladder were the serfs, who worked the land and were tied to it, unable to leave without permission. Feudalism provided a sense of order and stability in a time of widespread violence, but also reinforced social inequality and limited individual freedom.
It began to decline in the 14th century as centralized monarchies gained power and economic changes transformed Europe. But it would take almost 500 years more for modern versions of capitalism to emerge.
European monarchies were seeking new trade routes to the East to bypass the monopoly of Arab and Venetian merchants. They also sought to find new sources of valuable commodities such as gold, silver, and spices. The development of new ships and navigational tools, such as the astrolabe and compass, made long-distance travel safer and more efficient. And the Renaissance sparked a renewed interest in classical knowledge and a desire to explore the world and expand human knowledge.
These factors, among others, contributed to the Age of Exploration and led to the discovery of new lands, the establishment of trade routes, and the colonization of much of the world by European powers.
Joint stock companies played a significant role in the colonization of the world by European powers, particularly the Dutch East India Company. They were created to facilitate the exploitation and extraction of resources from the colonies. They allowed private individuals, mostly rich landowners and merchants, to invest in these endeavors, spreading the risk among many investors rather than leaving it on the shoulders of one person.
Joint stock companies are often referred to as the precursor to the modern corporation. But that is a bit of a leap. They were probably closer to the temporary partnerships formed around modern real estate development projects, or to import-export businesses, since they weren’t actually engaged in production and didn’t employ many people.
From the fourteenth to the eighteenth century, then, the vast majority of the food and products consumed by the population of European serfs continued to be produced at home on the family farm. Some items were made by master craftsmen in the cities, but these were largely luxury goods sold mostly to the wealthy nobility.
The next major development occurred in England during the seventeenth century, one that started slowly but would dramatically accelerate during the nineteenth century and lead to our current economic system: the enclosure of the commons.
The enclosure movement in England refers to the legal and economic process of consolidating small landholdings into larger farms through the privatization of common lands. The consequences of the enclosure movement included the displacement of small farmers and rural communities leading to a mass exodus of labor into the cities. It also facilitated the concentration of land ownership and wealth in the hands of an ever shrinking number of powerful landowners.
The industrial revolution in England was closely connected to the enclosure movement. The enclosure movement provided the necessary preconditions for the growth of commercial agriculture, which produced the raw materials necessary for the expansion of industry. As small farmers were displaced from their land, many were forced to seek work in the growing factories and mills of the industrial revolution.
"The Making of the English Working Class" by E.P. Thompson focuses on the experiences and agency of working-class people themselves rather than the actions of the elites. Thompson's book challenges traditional views of the working class as passive victims of historical forces and instead portrays them as active agents of their own destiny, shaping their own lives and the course of history.
Keep in mind that during the eighteenth and nineteenth centuries in England, the vast majority of people, who were predominantly engaged in agricultural pursuits, did not have access to democratic institutions. The emergence of industrialism during this period was accompanied by harsh working conditions and a general disregard for the well-being of workers by the ruling elite. This resulted in extreme exploitation of the working class, as policy-makers of the time prioritized the interests of industrial capitalists over those of the laboring masses.
Thompson saw the enclosure movement in England as a process of dispossession that involved the privatization of common lands and the concentration of land ownership and wealth in the hands of a new class of capitalist farmers. He emphasized the importance of working-class culture, including collective rituals, traditions, and symbols, as a means of resistance and identity formation, and explored the role of popular movements such as the Luddites and Chartists in challenging the dominant power structures of their time.
Around the same time that the industrial revolution was underway in England, westward expansion was transforming North America. There is a lot of mythology about the American pioneer, driven by the prospect of finding new and fertile land to work. But while settlers initially established farms, many of them behaved more like speculators, constantly on the move and looking for new opportunities to profit. They would often buy up forest land, clear it, and then sell it to newcomers, thus perpetuating the cycle of westward expansion.
The Gold Rush in California in the mid-19th century was the ultimate example of this speculative fever. Thousands of people flocked to the region from 1848 onward, hoping to strike it rich by finding gold in the rivers and hills. Although some did strike gold and become wealthy, many others failed and were forced to return home empty-handed. It’s frequently overlooked that it was actually the financiers and the merchants selling the picks and shovels who made off like bandits during the Gold Rush.
Throughout the nineteenth century in both Europe and the United States most people still worked in agriculture rather than in cities. Jobs were dirty, dangerous, and precarious. Companies were typically run by individual owners or small partnerships, with relatively little regulation or oversight from the state. These firms tended to be relatively small in size and scope, focused on local or regional markets and typically operated in a single industry or sector.
Bubble Bubble, Toil and Trouble
Management structures in these companies were often highly centralized, with decision-making authority concentrated in the hands of the owners or partners. There was little separation between ownership and control, and decisions were often made based on the personal preferences and interests of the owners rather than a formalized process of corporate governance.
In the late nineteenth and early twentieth centuries, the modern corporation was born, in the form of trusts. A classic example is the well-documented arrangement among Standard Oil baron John D. Rockefeller, steel magnate Andrew Carnegie, and legendary financier J. P. Morgan, whose interests were combined to form the U.S. Steel Corporation in 1901, covering some 60 percent of US steel production, largely as a way to get around rules that were in place for regular companies. These formations, originally known as trusts, are the reason anti-trust laws are so named today.
At the start of the twentieth century, financiers were thus the dominant force in corporate management, exerting significant control over the operations of many companies. However, following the discrediting of the financial sector in 1929 and the implementation of Roosevelt’s New Deal policies, a new class of professional corporate managers emerged. These managers went on to oversee major industrial firms through World War II and into the 1970s.
The period from WWII to the 1970s is known as the Golden Years because it was a time of great economic growth and social progress in many parts of the world. In the aftermath of WWII, many countries were focused on rebuilding their economies, and this led to an unprecedented period of prosperity.
The baby boom that followed the war also contributed to a surge in consumer spending, which in turn drove economic growth. This era saw the rise of the middle class, and many families enjoyed a standard of living that was previously unthinkable. The development of the first computers, the birth of the space age, and the widespread use of television were just a few of the many technological innovations that transformed society. The Civil Rights Movement also gained momentum during this time, leading to significant gains in racial equality and social justice.
Real wages (adjusted for inflation) during the Golden Years, from the end of WWII to the 1970s, generally kept pace with productivity and profit rates. It’s not a coincidence that this was also the period of highest participation in, and strongest political clout of, industrial labor unions. Starting in the 1970s, though, high finance gradually regained its influence, particularly during the Reagan and Bush administrations. It gained even more power during the Clinton years, with Treasury Secretary Rubin and others working to dismantle as much of the New Deal banking regulation as possible.
With the growing power of finance capital came a renewal of the old speculative methods of capital accumulation from the pre-1929 days. Now, instead of trusts, it was mergers and acquisitions, offshoring of manufacturing, and stock buybacks that replaced real investment in the productive economy. The result was a flood of inexpensive money injected into the economy, which initially inflated real estate bubbles across East Asia, as we previously discussed, followed by a significant collapse.
Real wages have not kept up with productivity gains since the 1970s. This has led to a growing income gap between the wealthiest Americans and the rest of the population that persists to this day.
The two themes of eternal truths and teleology were used throughout this period to suppress dissent and dampen arguments for changing how things are done. Whenever the status quo for common people was being disrupted by change, resistance to that change was met with an argument from ruling elites framed in the myth of teleology: “You can’t stand in the way of progress.” At other times, when common people fought to change the status quo for the better, the refrain from elites was in the language of the myth of eternity: “This is the best it is ever going to get, and you must get used to the way things are.”
We must reject these twin myths that stand in the way of making the world a better place. There is no set of eternal truths in the human-created world of business. The concepts of money, markets, companies, profit, scarcity, and so on are all socially constructed by people, rooted in their social context. That doesn’t mean those concepts aren’t valuable or that they don’t serve a purpose. But they are ephemeral and abstract, based purely on tacit agreement and trust in the systems which hold them in place.
Likewise, there is no inevitable march of progress. There is only change. All social, political, and economic change is the result of the aggregate of actions taken by groups and individuals, sometimes in struggle with one another, and again embedded within a unique historical and social context. There is always room for possibility, for change, for new ways of being, seeing, and thinking. We need only to muster the courage to stand by our values and our principles and to inspire others to do the same.
Stand For Something
Our average CEO thus grew up during the most intense period of financial “group think” in history. Over the last few decades, technology and finance have become increasingly intertwined. In fact, the integration between these two industries has become so seamless that it can be difficult to distinguish one from the other. And so the thinking prevalent in one tends to strongly influence the thinking of the other, and not always to positive effect.
Finance has a tendency to exacerbate bubbles, as we’ve seen, which can cause significant challenges for technology CEOs who are trying to make decisions based on values and purpose. While the integration of technology and finance has undoubtedly brought about many benefits, it is important for industry leaders to remain vigilant against the risks that arise from this relationship. CEOs of private firms who strive to make a positive impact on the world will encounter significant pressure from venture capital or private equity, while those steering public companies face the same pressure from Wall St.
Investors are driven to maximize profits in the shortest possible time frame, often leading them to push for decisions that prioritize profitability over broader ethical or environmental considerations. But this short-sighted approach can make it challenging for CEOs to achieve their goals of making a meaningful and lasting impact on society, and may force them to make difficult trade-offs between short-term financial gains and the long-term well-being of their stakeholders.
Yet an increasing number of CEOs today express a genuine concern for social and environmental issues, and a desire to use their companies as a force for good. Still, many of these executives are hesitant to speak out publicly about their values and beliefs, fearing backlash from shareholders, customers, or the media. This reticence may stem from a perception that discussing social issues is outside the traditional purview of business leaders, or from concerns that taking a stand could be seen as politically motivated or divisive.
Despite the recent wave of layoffs in the technology sector, the overall job market remains extremely hot, with unemployment historically low at around 3.4%. GDP growth is still strong. In 2022, a record number of people quit their jobs. The previous record was set the year before. So, while the so-called “Great Resignation” was big news for a while in 2021, the reality is that it was actually bigger in 2022, and has not really stopped or even slowed down.
But optimism about the economy doesn’t seem to be the main reason people are leaving their jobs. Instead, the main reasons given include the desire for better work-life balance, an increased focus on personal priorities, burnout and stress, and general dissatisfaction with a job or employer. In short, the pandemic really rocked people’s expectations about work, and millions have responded by changing jobs. People are looking for purpose and meaning in their work now more than ever.
In recent years, there has been a shift towards more modern methods of product development such as agile, lean, and design thinking. These methodologies prioritize flexibility, speed, and customer-centricity, allowing teams to work more efficiently and effectively. But they are also totally in line with workers' values of independence, autonomy, and purpose. By empowering workers to take ownership of their work and to contribute meaningfully to the development process, these methodologies are helping to create a more engaged and motivated workforce.
It is critical for CEOs to create an environment that allows employees to be as much their own bosses as possible. Appeals to the myth of eternal truths, such as the claim that speculative forces always have the final say and leave us powerless to challenge them, or appeals to the myth of teleology, such as the claim that there is no way to stop the juggernaut of tech-fueled speculative bubbles, must be resisted at all costs.
Only leaders who are firmly committed to their values and principles of making the world a better place, who are prepared to take risks and maybe even make some enemies on Wall St or in Silicon Valley, will be able to create the kinds of companies that the new generation will want to work for. That worker support for, and attraction to, companies that are really working to make the world a better place will form the base on which that better world can be built.
References
Braudel, F. (1995). A history of civilizations. Penguin Books.
Brenner, R., Glick, M., & Theodore, N. (2006). The economics of global turbulence: The advanced capitalist economies from long boom to long downturn, 1945-2005. Verso.
Graeber, D., & Wengrow, D. (2021). The dawn of everything: A new history of humanity. Penguin Press.
Hobsbawm, E. (1996). The age of capital: 1848-1875. Vintage Books.
Thompson, E. P. (1966). The making of the English working class. Vintage Books.
—
Ready to transform your organization?
It can be hard to figure out where you should focus your efforts. That’s why we’ve designed a diagnostic that pinpoints the areas of your organization with the most untapped potential. Our clients love it. It asks you a series of questions that we’ve perfected over the last four years of development, and gives you a simple report so you can get clear about where you should put your energy. Reach out to us if you want to give it a try.