Why Quantum Computing's Value Lies Beyond Stock Market Volatility

There has been quite a bit of commotion recently around the price action of quantum computing stocks, particularly following comments made by Jensen Huang, CEO of NVIDIA. Huang mentioned that he does not anticipate the emergence of 'very useful quantum computers' for another 15 to 20 years.
This opinion has sparked debate within the community—perhaps unsurprisingly, given the substantial resources being poured into quantum computing development.
However, I believe the current discourse is fixated on the wrong aspects. Instead of focusing on stock price fluctuations, we should redirect our attention to the more fundamental and promising aspects of this technology.
Lessons from Amazon: Valuing Emerging Technologies
Let’s take a step back to the late 1990s, when a small company called Amazon.com was making headlines. If you’re old enough to remember, Amazon went public on Nasdaq on May 15, 1997, with a valuation of around $300 million. At the time, it was simply an online bookstore—nothing more.
Fast forward two years to November 1999, and Amazon's valuation had skyrocketed to approximately $42.5 billion.
So what triggered this dramatic increase? Massive gains in profitability? Unprecedented sales growth?
No. The company remained unprofitable for another decade.
The change: Amazon simply expanded beyond books to offer a wider range of products. That’s it. But it was a decision that revolutionized the industry.
What Is It Worth?
This brings us to how new market opportunities are priced by investors. Having worked in finance for more than a decade, I find this part genuinely fun. To explain, let’s use a highly simplified formula:
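In its barest form, it is just a product of three factors (a minimal sketch; any real valuation model would dress this up considerably):

$$\text{Value today} \;\approx\; \text{Market size} \times \text{Probability of realization} \times \text{Discount factor for timing} \tag{1}$$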
The first two factors in this equation—market size and probability—are far more influential than the third.
Understanding Market Size and Timing Factors
When it comes to market size, many firms specialize in estimating potential opportunities, dedicating time and expert resources to arrive at rough figures. In the case of quantum computing, these estimates[1] often exceed the trillion-dollar mark. If you take issue with that number, please take it up with them—but keep in mind, they’ve been right before, even when their figures seemed outlandish at the time.
The second factor, the probability of quantum computing becoming a reality, is the one that truly matters. But before we dive into that, let’s briefly discuss the third factor—timing.
While the timeline for the emergence of 'very useful quantum computers' is important for various reasons, it doesn’t impact today’s investment value as much as you might think. Jensen Huang suggested that such advancements are at least 15 years away, while others—including myself—believe it will happen within the next 8 years. To account for these differing views, we apply the following formula to discount each scenario:
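In its simplest form (assuming plain annual compounding), the discount factor for a payoff expected $t$ years from now is

$$\text{Discount factor} \;=\; \frac{1}{(1+r)^{t}} \tag{2}$$

where $r$ is the risk-free rate of return.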
Why the Risk-Free Rate of Return Matters
You might be wondering why the term 'risk-free rate of return' appears here. Simply put, it represents the opportunity cost of investing in quantum computing compared to letting money accumulate interest in a risk-free investment, such as a government bond. This makes sense, as the 'risk' component is already accounted for in the second factor of Equation 1 (probability of realization). Looking back at the past fifteen years, a reasonable estimate for the risk-free rate is around 3%. Of course, if you have a different outlook for the next fifteen years, feel free to adjust accordingly.
But using the 3% risk-free rate, the discount factor for an 8-year timeline comes out to 0.79. And for 15 years? It’s 0.64. Yes, those numbers differ, but let’s put things into perspective: the gap between 0.79 and 0.64 (a ratio of roughly 1.2) is actually smaller than the price swings we’ve seen in some quantum computing stocks within a single hour of trading.
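If you want to check that arithmetic yourself, a few lines of Python will do it (again assuming simple annual compounding):

```python
# Sanity-check the discount factors quoted above.
def discount_factor(rate: float, years: float) -> float:
    """Present-value discount factor for a payoff `years` from now."""
    return 1.0 / (1.0 + rate) ** years

RISK_FREE_RATE = 0.03  # the ~3% estimate used in the text

for horizon in (8, 15):
    print(f"{horizon} years: {discount_factor(RISK_FREE_RATE, horizon):.2f}")
# 8 years: 0.79
# 15 years: 0.64
```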
At this point, I hope it's becoming clear that the factor truly driving the recent market frenzy is the second one in our original formula—the probability of useful quantum computing happening at all at some point in the not-too-distant future.
The Probability Factor: Why it Matters Most
After all, if you look at what sparked this recent market activity, it was Google's Quantum AI team announcing their first demonstration of quantum error correction beyond the break-even point. And, in my opinion, the reaction was entirely justified. This breakthrough settled the one question that had been holding the entire field back: whether quantum error correction was technically feasible at all. By conclusively demonstrating that it is, Google significantly increased the perceived probability of success, which in turn explains the sharp rise in valuations.
Incidentally, another company had already achieved and published a peer-reviewed demonstration of quantum error correction beyond break-even before Google—yes, it happened right here at Nord Quantique (link here). Why didn’t this shake up the market in the same way? I’ll leave that to the reader’s sharp deductive skills, but here’s a hint: Google's result was actually made public months before their official year-end announcement, though it was posted on a platform primarily frequented by fellow nerds like us—'the ArXiv.' But I digress.
Why Error Correction Is a Game-Changer
So why is this particular technical milestone—quantum error correction beyond break-even—so significant? Simply put, because useful quantum computing can’t happen without it.
Sure, quantum computers already exist and are even commercially accessible, but they make far too many errors. How many? Roughly one error for every few thousand operations.
To achieve truly useful quantum computing, this error rate needs to be reduced to about one in a billion operations. That’s a massive challenge. And that’s precisely why error correction is absolutely essential.
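Putting rough numbers on that gap (reading 'a few thousand' as something like $10^{3}$ to $10^{4}$ operations per error):

$$\text{today: } \sim 10^{-3}\text{–}10^{-4} \ \text{errors per operation} \quad\longrightarrow\quad \text{needed: } \sim 10^{-9}$$

That is an improvement of five to six orders of magnitude.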
One crucial point that Google’s announcement didn’t emphasize (we can guess why) is that for error correction to be truly workable and scalable, it must be cost-effective.
If you look closely at Google’s Nature paper, you’ll find that their current approach requires a physical overhead of up to 1450 times the original system size. In other words, a practical quantum computer built with their method would have to be roughly 1450 times larger than the uncorrected machine, just to accommodate error correction.
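As a rough way to see where a number of that size can come from (my own back-of-the-envelope using the textbook rotated-surface-code count, not something spelled out in the paper): a surface-code logical qubit of code distance $d$ needs about $2d^{2}-1$ physical qubits, and the distances typically quoted for one-in-a-billion logical error rates sit in the high twenties:

$$N_{\text{physical}} \approx 2d^{2}-1, \qquad d = 27 \;\Rightarrow\; 2(27)^{2}-1 = 1457.$$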
At Nord Quantique, we see this as unrealistic. That’s why we’ve designed our system with no such qubit overhead for error correction: a 1:1 ratio between physical and logical qubits. We demonstrated this breakthrough last year, and while we acknowledge that scaling remains a challenge, we firmly believe we’re on the right track to achieve it in the most efficient way possible.
Keeping the cost of building the system under control is a critical part of this journey.
Back to Amazon: A Case Study in Scaling
We began this discussion with Amazon, and it’s worth revisiting the comparison. It’s easy to dismiss it as an imperfect analogy for quantum computing, but I believe it holds up quite well.
Looking back, it wasn’t at all obvious that Amazon’s early business model could scale in any meaningful way. In fact, many considered it an almost impossible challenge—and they weren’t wrong! Scaling Amazon took an immense effort and a lot of time. But they made it. Today, Amazon’s valuation stands at over $2.3 trillion—yes, that’s trillion with a ‘t.’
It’s also worth noting that after 1999, Amazon’s stock took a significant dip—just like quantum stocks have recently.
However, anyone who bought Amazon stock at its peak back then and held onto it would hardly call it a bad investment today, considering it has skyrocketed by a whopping 5500% since.
Choosing the Right Investment in Quantum Computing
Another crucial factor for investors—one we haven’t discussed yet—is the challenge of choosing the right company to back.
While Amazon emerged as a dominant force in e-commerce, it wasn’t the only success story—dozens of others have thrived in the space. The same challenge exists in quantum computing, with numerous competing approaches and physical modalities for building a viable quantum computer.
However, over the past year, this landscape has become much clearer. Increasingly, we have a better understanding of which technologies have a realistic shot at scaling up to a truly useful quantum computer within a reasonable timeframe.
DARPA in the US has launched the Quantum Benchmarking Initiative (QBI) to address this very issue. Through this program, they aim to identify and support companies that have presented credible roadmaps to achieving value-added, utility-scale quantum computing within the next eight years—essentially their way of saying 'very useful.'
Unsurprisingly, scalable error correction remains at the heart of this effort.
For investors, this is a pivotal moment. As the field advances rapidly, those who can identify and support the right technologies and teams stand to benefit significantly. The path to scalable, practical quantum computing is becoming clearer, and with it, new opportunities are emerging.
Stay tuned—there’s much more to come as we continue to explore the investment potential of this transformative technology.
About the Author
Co-Founder of Nord Quantique, Phil obtained his PhD in Physics from Université de Montréal and spent 12 years working in the financial sector. As an investor, he supported the world's first quantum computing company and sat on its board of directors from 2015 to 2019. As head of business development for Nord Quantique, he focuses on the commercial development of our fault-tolerant quantum computers. When he's not working, he enjoys cross-country skiing and cycling.
[1] McKinsey: https://www.mckinsey.com/featured-insights/the-rise-of-quantum-computing
BCG: https://www.bcg.com/publications/2023/enterprise-grade-quantum-computing-almost-ready