
Why Software Won't Eat The World

Greg Satell
Sep 26 · 5 min read
Image: Flickr JD Lasica

In 2011, technology pioneer Marc Andreessen declared that software is eating the world. “With lower start-up costs and a vastly expanded market for online services,” he wrote, “the result is a global economy that for the first time will be fully digitally wired — the dream of every cyber-visionary of the early 1990s, finally delivered, a full generation later.”

Yet as Derek Thompson recently pointed out in The Atlantic, the euphoria of Andreessen and his Silicon Valley brethren seems to have been misplaced. Former unicorns like Uber, Lyft, and Peloton have seen their value crash, while WeWork saw its IPO self-destruct. Hardly “the dream of every cyber-visionary.”

The truth is that we still live in a world of atoms, not bits, and most of the value is created by making things we live in, wear, eat and ride in. For all of the tech world’s astounding success, it still makes up only a small fraction of the overall economy. So taking a software-centric view, while it has served Silicon Valley well in the past, may be its Achilles’ heel in the future.

The Silicon Valley Myth

The Silicon Valley way of doing business got its start in 1968, when an investor named Arthur Rock backed executives from Fairchild Semiconductor to start a new company, which would become known as Intel. Unlike back east, where businesses depended on stodgy banks for financing, on the West Coast it was venture capitalists, many of them former engineers themselves, who decided which technology companies got funded.

Over the years, a virtuous cycle ensued. Successful tech companies created fabulously wealthy entrepreneurs and executives, who would in turn invest in new ventures. Things shifted into hyperdrive when the company Andreessen founded, Netscape, quadrupled its value on its first day of trading, kicking off the dotcom boom.

While the dotcom bubble would crash in 2000, it wasn’t all based on pixie dust. As the economist W. Brian Arthur explained in Harvard Business Review, while traditional industrial companies were subject to diminishing returns, software companies with negligible marginal costs could achieve increasing returns powered by network effects.
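To make Arthur’s contrast concrete, here is a minimal, purely illustrative sketch in Python (not from his article): a hypothetical industrial firm whose per-unit margins shrink as marginal costs creep up, versus a hypothetical software firm with a large fixed cost, near-zero marginal cost, and revenue that grows with the square of its user base, a Metcalfe-style stand-in for network effects. Every number here is invented; only the shape of the curves matters.

```python
# Toy sketch (not from the article): diminishing vs. increasing returns.
# All figures and the users**2 "network value" term are hypothetical,
# chosen only to illustrate W. Brian Arthur's distinction.

def industrial_profit(units):
    # Diminishing returns: each additional unit costs a bit more to make.
    revenue = 100 * units
    cost = sum(60 + 0.0005 * u for u in range(units))
    return revenue - cost

def software_profit(users):
    # Increasing returns: big fixed cost, near-zero marginal cost,
    # and value that grows with the size of the network.
    fixed_cost = 5_000_000
    marginal_cost = 0.01 * users
    network_revenue = 0.05 * users ** 2
    return network_revenue - fixed_cost - marginal_cost

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} -> industrial: {industrial_profit(n):>12,.0f}  "
          f"software: {software_profit(n):>14,.0f}")
```

Run it and the industrial firm’s profit keeps growing while its margin per unit falls, whereas the software firm loses money at small scale and then dwarfs it once the network takes off, which is the increasing-returns dynamic Arthur described.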

Yet even as real value was being created and fabulous new technology businesses prospered, an underlying myth began to take hold. Rather than treating the software business as a special case, many came to believe that the Silicon Valley model could be applied to any business. In other words, software would eat the world.

The Productivity Paradox (Redux)

One reason that so many outside of Silicon Valley were skeptical of the technology boom for so long was a longstanding productivity paradox. Although business investment in computer technology grew by more than 20% per year throughout the 1970s and 80s, productivity growth diminished over the same period.

In the late 90s, however, this trend reversed itself and productivity began to soar. It seemed that Andreessen and his fellow “cyber-visionaries” were redeemed. No longer considered outcasts, they became the darlings of corporate America. It appeared that a new day was dawning, and the Silicon Valley ethos took hold.

While the dotcom crash deflated the bubble in 2000, the Silicon Valley machine was soon rolling again. Web 2.0 unleashed the social web, smartphones initiated the mobile era, and IBM Watson’s defeat of human champions on the game show Jeopardy! heralded a new age of artificial intelligence.

Yet still, we find ourselves in a new productivity paradox. By 2005, productivity growth had disappeared once again and has remained diminished ever since. To paraphrase economist Robert Solow, we see software everywhere except in the productivity statistics.

The Platform Fallacy

Today, pundits are touting a new rosy scenario. They point out that Uber, the world’s largest taxi company, owns no vehicles. Airbnb, the largest accommodation provider, owns no real estate. Facebook, the most popular media owner, creates no content and so on. The implicit assumption is that it is better to build software that makes matches than to invest in assets.

Yet platform-based businesses have three inherent weaknesses that aren’t always immediately obvious. First, they lack barriers to entry, which makes it difficult to create a sustainable competitive advantage. Second, they tend to create “winner-take-all” markets, so for every fabulous success like Facebook there can be thousands of failures. Finally, rabid competition leads to high costs.

The most important thing to understand about platforms is that they give us access to ecosystems of talent, technology and information, and it is in those ecosystems that the greatest potential for value creation lies. That’s why, to become profitable, platform businesses eventually need to invest in real assets.

Consider Amazon: almost two-thirds of its profits come from its cloud computing unit, AWS, which provides computing infrastructure for other organizations. More recently, it bought Whole Foods and opened its first Amazon Go retail store. The more you look, the less Amazon resembles a platform and the more it resembles a traditional “pipeline” business.

Reimagining Innovation For A World Of Atoms

The truth is that the digital revolution, for all of the excitement and nifty gadgets it has produced, has been somewhat of a disappointment. Since personal computers first became available in the 1970s, we’ve had less than ten years of elevated productivity growth. Compare that to the 50-year boom in productivity that followed electricity and the internal combustion engine, and it’s clear that digital technology falls short.

In a sense, though, the lack of impact shouldn’t be that surprising. Even at this late stage, information and communication technologies make up only about 6% of GDP in advanced economies. Clearly, that’s not enough to swallow the world. As we have seen, it’s barely enough to make a dent.

Yet there is still great potential in the other 94% of the economy, and there may be brighter days ahead in using computing technology to drive advances in the physical world. Exciting new fields, such as synthetic biology and materials science, may very well revolutionize industries like manufacturing, healthcare, energy and agriculture.

So we are now likely embarking on a new era of innovation that will be very different from the digital age. Rather than being focused on one technology, concentrated in one geographical area and dominated by a handful of industry giants, it will be widely dispersed and made up of a diverse group of interlocking ecosystems of talent, technology and information.

Make no mistake. The future will not be digital. Instead, we will need to learn how to integrate a diverse set of technologies to reimagine atoms in the physical world.

Greg Satell is an international keynote speaker, adviser and bestselling author of Cascades: How to Create a Movement that Drives Transformational Change. His previous effort, Mapping Innovation, was selected as one of the best business books of 2017. You can learn more about Greg on his website, GregSatell.com, and follow him on Twitter @DigitalTonto.
