Why Silicon Valley Had to Reinvent What Already Existed (Part 2)
The Innovation the World Was Not Ready For
In 1995 a Mercedes S-Class crossed the German-Danish border at highway speed with no one touching the steering wheel. Two decades later, Silicon Valley would spend tens of billions of dollars trying to recreate what Ernst Dickmanns had already achieved.
When Dickmanns completed the Odense highway journey that year, his Mercedes-Benz S-Class prototypes had crossed international borders using only visual perception, mathematical prediction and custom control logic. There was no lidar. There were no high-definition maps. There was no cloud computing. There were only cameras and algorithms solving problems that industry would rediscover twenty years later.
His success arrived before the world was ready to receive it. The automotive industry in 1995 celebrated incremental victories such as better airbags, anti-lock brakes and traction control. Autonomy was viewed as a scientific curiosity rather than a business priority. There was no regulatory framework for testing self-driving cars and no modern venture capital ecosystem to fund it.
When the EUREKA PROMETHEUS program ended after spending 749 million euros over eight years, equivalent to more than 1.5 billion dollars today, Dickmanns and his vehicles were quietly retired. His research did not fail. It simply arrived too early. The market could not comprehend camera-only intelligent vehicles. The industry turned away and autonomy went silent for nearly a decade.
The Quiet Influence He Left Behind
Although his name faded from public view, his work spread through the automotive world like a hidden genetic code. His students and collaborators carried his ideas into Germany's engineering core.
Uwe Franke, who worked with Dickmanns on dynamic vision, joined Daimler and advanced real-time stereo vision research for driver assistance. His algorithms for depth perception from moving cameras helped lay the groundwork for the stereo-camera perception behind Mercedes-Benz driver assistance features such as adaptive cruise control. Every Mercedes that maintains a safe following distance today traces part of its lineage to techniques developed in Dickmanns' lab.
Reinhard Behringer brought predictive path concepts to BMW, where they influenced early lane detection work. At Bosch, former team members contributed to motion modeling and perception research that later strengthened advanced driver assistance systems.
These systems were not marketed as autonomous driving breakthroughs. They were sold as safety and comfort features. Yet each one reflected the same principle. Cameras interpreted motion. Computers predicted behavior. Vehicles made decisions using onboard intelligence.
Nearly a decade after Dickmanns' retirement, the autonomous driving dream suddenly resurged. In 2004 DARPA launched the Grand Challenge and offered one million dollars to anyone who could build a self-driving vehicle. Teams attacked the problem with expensive lidar scanners, military-grade GPS and pre-mapped routes. No vehicle finished that first race; when Stanford's Stanley won the 2005 rematch, the achievement was celebrated as unprecedented.
It was not. These teams were solving a problem that Dickmanns had already solved in principle. They used lidar and large data sets. He used cameras and mathematics. They had massive compute. He had hardware from the 1980s. They were the explorers who arrived second, celebrated for discovering what had already been found.
Why His Ideas Matter More Than Ever
The modern autonomous driving industry is slowly converging on the same principles Dickmanns pioneered. Few acknowledge the source but the influence is undeniable.
Consider prediction. His four-dimensional vision treated motion as something that must be understood over time rather than as isolated images. Today every major autonomy company invests in behavior prediction. When a Tesla Model 3 navigates a curved highway exit, it uses temporal modeling. Waymo runs full research teams for motion forecasting. Cruise invests heavily in trajectory prediction. The terminology changed but the principle did not. Safe driving demands prediction of future motion.
What made Dickmanns' approach revolutionary was a philosophical shift from "measure everything" to "understand motion." While modern systems often rely on sensor fusion combining lidar, radar, and cameras to create detailed 3D maps, Dickmanns proved that intelligent interpretation of visual data could achieve the same results. He treated driving as a problem of understanding dynamic patterns, not cataloging static objects.
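Dickmanns realized this idea as recursive state estimation: rather than re-measuring the world from scratch in every frame, the system maintains a dynamic model of each object and corrects it with each new observation. A minimal sketch of that principle, using a textbook constant-velocity Kalman filter in Python (the noise values and the 10 m/s scenario are illustrative assumptions, not Dickmanns' actual parameters):

```python
# Minimal sketch of recursive state estimation in the spirit of
# Dickmanns' 4D approach: fuse noisy position measurements (e.g. from
# a camera) into a smooth estimate of position AND velocity, which
# lets the system predict where an object will be.
import numpy as np

dt = 0.1                                  # time step in seconds (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition: x' = x + v*dt
H = np.array([[1.0, 0.0]])                # we measure position only
Q = np.diag([1e-3, 1e-2])                 # process noise (assumed values)
R = np.array([[0.25]])                    # measurement noise (assumed)

x = np.array([[0.0], [0.0]])              # initial state [position, velocity]
P = np.eye(2) * 10.0                      # initial uncertainty

def step(x, P, z):
    # Predict: project the state forward one time step.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct the prediction with the new measurement z.
    y = np.array([[z]]) - H @ x           # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Feed noisy camera-style measurements of a car moving at ~10 m/s.
rng = np.random.default_rng(0)
for k in range(50):
    true_pos = 10.0 * (k + 1) * dt
    z = true_pos + rng.normal(0.0, 0.5)
    x, P = step(x, P, z)

# The filter recovers the unmeasured velocity from position data alone,
# and projecting the state forward predicts future motion.
future_pos = (np.linalg.matrix_power(F, 10) @ x)[0, 0]  # one second ahead
```

The point of the sketch is the division of labor: a cheap dynamic model carries understanding between frames, and each camera measurement only nudges it, which is how 1980s hardware could keep up with highway traffic.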
The camera versus lidar debate followed the same path. While companies spent fortunes on rotating laser scanners, Dickmanns had already proven that cameras alone could handle highway autonomy. In 2019 Elon Musk declared that lidar is unnecessary and that cameras would solve autonomy. That statement echoed work from 1986.
Tesla's Autopilot system provides the clearest example of this lineage. The company's Full Self-Driving computer processes video from eight cameras to build a bird's eye view of the environment. It predicts the future positions of vehicles, pedestrians, and cyclists up to several seconds ahead. The system uses what Tesla calls "4D labeling" to understand how objects move through space over time. Every element, from the camera-only approach to the temporal prediction, mirrors principles Dickmanns established decades earlier.
Even modern artificial intelligence carries his fingerprints. Transformer networks use attention mechanisms to focus computation. Dickmanns did the same thing physically. His cameras moved to track relevant objects so the system ignored distractions. He designed attention before the term existed.
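The parallel can be made concrete. Scaled dot-product attention, the core operation of transformer networks, computes a weighted focus over its inputs, spending a fixed "attention budget" on the most relevant ones. A minimal NumPy sketch (the toy matrices are illustrative, not taken from any real model):

```python
# Minimal sketch of scaled dot-product attention, the mechanism the
# text compares to Dickmanns' steerable cameras: each query assigns
# weights to all inputs and focuses computation on the most relevant.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V
    d = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d))
    return weights @ V, weights

# Toy example: one query attending over three candidate "objects".
rng = np.random.default_rng(1)
Q = rng.normal(size=(1, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = attention(Q, K, V)
# The weights w sum to 1: the model distributes a fixed focus budget
# across its inputs, much as a steerable camera distributes its
# limited field of view across a scene.
```

The mechanisms differ physically, but both solve the same resource problem: there is never enough computation to process everything, so the system must decide where to look.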
He also anticipated edge computing long before it had a name. While companies debate how much processing should run in the cloud, Dickmanns demonstrated full autonomy using only onboard processing. Real time local decision making remains essential in every safety critical system today.
The Lessons Innovators Must Not Ignore
The story of Ernst Dickmanns reveals how innovation truly happens. The most important lesson is that breakthrough ideas often arrive before the world is ready. He solved key challenges in autonomy years before anyone imagined a business model for it. His work survived because it was correct, not because it was commercially rewarded.
History repeats this pattern. Charles Babbage designed a programmable computer in 1837, long before one could be built. Hedy Lamarr patented frequency-hopping spread spectrum in 1942, decades before it powered wireless networks. Dickmanns demonstrated autonomous driving in the 1990s, two decades before the world cared.
This raises uncomfortable questions for technology leaders. How many breakthroughs are actually rediscoveries? How many ideas are lost because they appear too soon? How often does funding reward reinvention instead of originality?
His work also challenges the obsession with sensor quantity. His vehicles achieved remarkable results with intelligent processing rather than excessive hardware. He proved that understanding motion is more important than measuring everything. Mathematics defeated brute force. The industry learned that lesson after spending billions.
The Man Who Saw Tomorrow
Today Ernst Dickmanns lives quietly in Germany in his late eighties. He watches the world celebrate the rediscovery of his life's work. Every Tesla that stays in its lane. Every Mercedes that steers itself. Every camera-based driver assist system in modern vehicles. They all descend from the foundation he created.
He could have chased recognition. He could have demanded credit. Instead he chose clarity. He chose correctness. His legacy was built on ideas strong enough to survive time.
"I knew we were right," Dickmanns said in a rare interview. "The mathematics were sound. The approach was correct. Sometimes you must wait for the world to catch up."
Picture him watching a Tesla drive the same Autobahn where his Mercedes drove itself in 1995. The networks are faster. The processors are stronger. The software stacks are larger. Yet the principle remains unchanged. Vision creates perception over time. Prediction creates safety. Intelligence creates control.
Ernst Dickmanns did not just teach cars to move. He taught them to think. He did not just predict the future. He built it. The world simply needed time to understand what he had already shown.
All opinions are my own and do not reflect those of my employer.