Innovation vs execution, and the myth of the big idea

Google’s recent acquisition of Nest for $3.2bn has raised some eyebrows, with some people questioning whether the valuation is sensible, and others questioning how innovative Nest’s products really are. One comment I read claimed they did nothing that hadn’t been thought up years ago. On the valuation, it is a high figure, and perhaps some competition between Google and other suitors drove it that high. On Nest’s innovation, I’m not really familiar enough with their products to judge how groundbreaking they are, but the general complaint touches on something I’ve thought about for a while: how the importance of “innovation” is over-emphasised in technology coverage and commentary, while the importance of execution is under-emphasised. There’s also a common misunderstanding about where innovation really matters in making a product.

The problem is this: There aren’t many good ideas that someone, somewhere won’t already have thought about and tried. The world is full of clever and motivated people looking for success. But ideas are intangible, and while intellectual property law gives them legal status as something you can buy, own and sell, you can’t make a world-conquering business from them without turning them into a real product, and that’s where the difficulty starts. Because building a successful product involves doing thousands of different things, in different areas, and doing them all as well as possible.

This is the execution of an idea, the act of turning it from an intangible thing into a reality, and getting it right is far, far harder than coming up with the original idea. Thomas Edison reputedly said that genius is 1% inspiration and 99% perspiration, which seems to sum up the problem. Edison’s approach certainly embodied this: he would conduct a huge amount of experimentation to, for example, find the right material for light bulb filaments. Now, Nikola Tesla, who worked for Edison for a time, criticised this approach, claiming that Edison would waste huge effort on experimentally discovering things he could have determined through simple (to Tesla’s intellect) calculation. But even here we can see that Tesla is criticising Edison’s execution of his ideas, his wasteful perspiration, rather than the ideas themselves. He thought he could have executed on the idea better.

In coverage of technology, the press frequently plays up the original innovation, the “big idea”, behind a product. Nowhere is this more apparent than in the posthumous lionisation of Steve Jobs, who is referred to as the “genius” who invented the iPhone and the iPad. It’s a simple, heroic narrative, and therefore appeals to the news media, whose job it is to turn complicated, messy reality into easily digestible stories that we can read during lunch. But the problem is, it’s wrong. The iPhone and iPad weren’t inventions; they were highly skilled executions of ideas that had long existed. Jobs’ success wasn’t coming up with the idea for a tablet computer; others, such as Microsoft, had done that long before Apple. His success was that he, with a large team of others, managed to execute on that idea in such a way as to make it a huge success.

Apple don’t exactly discourage the press narrative, of course. Their products are marketed as seamless, gleaming and indivisible units. Apple may talk a little about the fancy materials they employ, or a particular new feature or facet, but you aren’t encouraged to think of them as being a composite of parts, or the end result of a messy engineering process involving hundreds of people, or compromised by technical and economic trade-offs. They’re perfect and complete, supposedly, at least until the new model arrives. It’s easy to think of such products as springing whole into the imagination of a genius inventor, who needs only to put his vision down on paper and leave it to lesser mortals to build it.

Such geniuses may have occasionally cropped up throughout history, but they’re exceptional, in every sense of the word, and technology has long surpassed the ability of a single person, no matter their intelligence, to conceive of every part of a product like the iPhone. Instead, teams of very clever people work for a long time, doing lots of different things, and at the end, perhaps, a good product emerges. Or it doesn’t. And that’s the other source of faulty thinking about innovation: the one that led to the iPad being dismissed by many people before it launched, because the idea had never worked in the past, so the idea itself was judged to be bad.

In fact, only the execution of the idea had been bad. Whether through mistakes by those building them, bad marketing, limitations of the available technology, or simple bad luck, these earlier products failed. And many people attributed that failure to the inherent badness of the idea itself, and insisted that it would never be a success. In retrospect, we can see they were wrong, but that doesn’t help us unless we learn how to avoid making the same mistake in future.

To properly judge an idea, we need to differentiate between its inherent qualities and those that have become attached to it via cultural association. We shouldn’t forget or ignore the latter, as they can still be very useful, but we should recognise that they can also change. Microsoft wanted to make tablet computers, but their idea of a tablet computer carried preconceptions formed by their experience with PCs and Windows that they couldn’t, or wouldn’t, jettison, and that stopped them from creating a successful product.

Similarly, we also need to properly recognise those parts of a product that are a necessary consequence of its core idea, and those that are a result of a particular execution. The overwhelming majority will be of the latter type, which prompts us to question whether doing one or many of them differently would have resulted in a product being a runaway success or a miserable failure.

The matter of making particular decisions differently, or changing certain things, brings us to the final point about innovation in building technology. The process of executing on an idea itself involves coming up with a huge number of further ideas, from choosing a name and a colour palette, to finding new solutions and workarounds for tricky technical problems, to clever ways to trade off competing constraints. All of these are examples of innovation, of a less flashy but more important type than coming up with the big idea. Products with a lot of this type of innovation are the truly groundbreaking ones, but it is the type of innovation that gets less coverage in the press and less recognition in the larger culture. It tends to get put in the box labelled “engineering” and forgotten about.

The Oculus Rift is, I suspect, the next product that may vividly illustrate all of the above. Virtual reality was considered a winning idea, but the problems and failures of its actual manifestation in products eventually tarnished it by association. Apart from a few die-hards, most people wrote it off as a joke. Now it looks like those same die-hards may get the last laugh, by building a product that does for VR what the iPad did for tablet computing, at least to some extent.

What impresses me most about Oculus is that they seem to really understand the importance of execution, sometimes to the frustration of people eager to get their hands on a headset. Palmer Luckey could have easily spun Kickstarter cash into a straightforward, commercialised version of his original prototype. It would have made money and been reviewed as the best attempt at VR that anyone had made. But it wouldn’t have been a world beater. It would have been low resolution, and made people sick, and had poor software support. It wouldn’t have changed people’s opinion that VR was a niche idea, with no prospect for broader success.

What Oculus have actually done is to expend the time, effort and (investor) money to try and get everything right. They’ve hired a large team, with expertise in hardware, software and business. They kept their promises to deliver dev kits, but they’ve resisted the urge to rush out a commercial product. Instead they’ve iterated on their technology and built up their infrastructure. They’re making considered trade-offs to ensure that no one aspect of the product, such as weight, resolution, field of view, latency, or price, dominates to the detriment of the others. The results, by all accounts, are spectacular already, and they still say they have a long way to go.

Because Oculus have taken the time to get the execution right, it looks like the final product will be something very special indeed. It’s a telling contrast with many of the supposed Oculus competitors that get hyped up in the tech press occasionally. They tend to be small teams, working on a prototype with a single unique selling point, whether it’s resolution or FOV, that is supposedly 100 times better than the Rift. The problem is, what sacrifices are they making in other areas of their products? And are they investing in the other things, like business organisation and infrastructure, that are key to success despite not being an actual part of the product? It doesn’t appear that any of them are.

If Oculus succeed, it will be because of their ability to apply the lessons of other successful products and execute well on an idea, not because of the inherent brilliance of the idea or the genius of their founder. The same lessons are important for anyone building anything in technology: focus less on coming up with the world-beating idea, and more on collaborating with others to build something world-beating a single step at a time.


Comments

2 responses to “Innovation vs execution, and the myth of the big idea”

  1. Troy

    Just a random drive-by comment:

    1) comparing Microsoft’s tablet efforts of 10+ years ago to the first iOS devices is like comparing the 1860s telegraph to 1960s touch-tone phones, or horse-drawn carriages to the Model T.

    Apple put the TOUCH in touch devices; that was their key innovation in the tablet space, and it totally obliterated the previous developments and direction of the space.

    And to get touch working correctly, they had to reinvent what the tablet UX was about — smooth scrolling, scaling, and making the finger the primary UI driver.

    This in turn greatly benefited from, if not outright required, a multi-touch sensor layer and a very performant graphics stack, plus an underlying API for app developers to cleanly implement the new UI behavior in.

    As for the Rift, I had the privilege to work for the main pioneer of VR back in the 1990s, a company located up in Leicester.

    There are a lot of usability issues with HMDs that only become apparent after sustained use — elementary things like the hygiene of having something attached to your face for a sustained period of time.

    I seriously doubt the Rift is going to go anywhere. Valve’s new optical head tracking approach does look to be a critical innovation, allowing low-latency and drift-free head-tracking, something that today costs $2000+ via Polhemus magnetic trackers.

    btw, I love your feature request for C# extension methods in Dart. You were right, but I think you went a bridge too far by looking at iteration. The simplest use case is just turning doFoo(a) into a.foo(), which alone justifies the feature (see the sketch below).

    In ObjC these are called ‘category methods’ and were the language’s best feature.
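
    To make the doFoo(a) → a.foo() point concrete, here’s a minimal C# sketch of an extension method (Widget and Frobnicate are made-up names, purely for illustration):

        using System;

        public class Widget
        {
            public string Name { get; set; }
        }

        public static class WidgetExtensions
        {
            // A plain static method, but the "this" modifier on the first
            // parameter lets callers invoke it as if it were an instance
            // method on Widget: w.Frobnicate() instead of
            // WidgetExtensions.Frobnicate(w).
            public static void Frobnicate(this Widget w)
            {
                Console.WriteLine("Frobnicating " + w.Name);
            }
        }

        public static class Demo
        {
            public static void Main()
            {
                var w = new Widget { Name = "gizmo" };
                w.Frobnicate(); // reads as an instance call
            }
        }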

  2. Jon

    Hey Troy,

    I agree that Apple’s focus and innovation around touch-based interaction was a key part of the iPhone and iPad’s success. But again, finger-based touch UI wasn’t without precedent. It was just that Apple executed on the idea better than anyone had before, to the point that, yes, they made everything else look incredibly primitive in comparison. Whether the big idea is “tablet form factor” or “touch-based UI” or both, what matters is how well you can execute on it, by reinventing the UI, creating good APIs, etc.

    My point was that that kind of work tends to be overlooked by the press when praising the “magic” feel of these devices. That magic isn’t the result of a single, genius idea, but many years of hard work and less obvious innovation to turn the overall idea into a reality.

    As for the Rift, I guess we’ll have to see, but aren’t all the usability issues, such as hygiene, ultimately just examples of the same kind of issues that touch-based UI faced? People said fingers were too big, dirty and imprecise for input, and didn’t play well with traditional UI tropes. Making HMDs a mainstream proposition surely just requires the same kind of innovation that made touch viable.
