How Should We View OpenAI's Future?
By Wang Ziwei @ Retail Wei Observation
On September 26th, the Wall Street Journal reported that OpenAI is in talks with investors about a sale of existing shares at a new valuation, which would let employees cash out by selling their stock to outside investors.
At the same time, OpenAI's valuation has skyrocketed from roughly $27 billion this past March to $80-90 billion. On the revenue side, sources familiar with the matter cited by the Wall Street Journal estimate that OpenAI will take in about $1 billion this year and several billion dollars in 2024.
At a $90 billion valuation on $1 billion of revenue, OpenAI's price-to-sales (P/S) ratio comes to an astonishing 90x, far above that of tech giants such as Microsoft and Nvidia.
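For readers less familiar with the metric, P/S is simply valuation divided by annual revenue. A minimal sketch in Python, using only the figures quoted above:

```python
# Price-to-sales (P/S) = company valuation / annual revenue.
# Figures below are the reported estimates quoted in this article.
valuation = 90e9  # ~$90 billion valuation
revenue = 1e9     # ~$1 billion estimated revenue this year

ps_ratio = valuation / revenue
print(f"P/S ratio: {ps_ratio:.0f}x")  # -> P/S ratio: 90x
```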
Let's examine the story OpenAI will have to tell to justify a $90 billion valuation while rapidly growing its revenue. The analysis runs along two lines: 2B and 2C.
【One】2B: Large Language Models Will Give Rise to "Model Factories"
Large language models (LLMs) such as GPT-3 have repeatedly broken through the limits of scale as computing power has grown exponentially.
The scaling laws governing these models are still being pushed forward, so it is foreseeable that even larger LLMs will emerge, with stronger language understanding and generation capabilities than today's smaller models.
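For context, the "scaling law" here refers to the empirical power-law relationship reported by OpenAI researchers (Kaplan et al., 2020): holding other factors fixed, test loss falls roughly as

    L(N) ≈ (N_c / N)^α_N,  with α_N ≈ 0.076,

where N is the number of model parameters and N_c is a fitted constant; the exponent is an approximate fit, quoted here only as an indication of the trend. Similar power laws were reported for dataset size and training compute, which is why larger models trained on more data keep getting better.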
However, training a top-tier LLM is extremely difficult; among companies worldwide, only a handful can do it. Moreover, even with access to a model's parameters, most organizations lack the expertise to pre-train one from scratch. These two points are precisely the core problems ordinary enterprises face in the era of large AI models.
Therefore, LLM development will have to be outsourced to "model factories" like OpenAI. Leveraging its strength in training top-tier models, OpenAI can derive and tailor thousands of small customized models to meet the individual needs of different customers, much as Apple designs custom A-series chips for different iPhone models.
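Concretely, one way a "model factory" can derive customized models today is fine-tuning through OpenAI's public API. The sketch below is a minimal illustration, assuming the openai Python package (its 2023, pre-v1 interface); the training file name and customer scenario are hypothetical, and this is not a description of OpenAI's internal process:

```python
import openai

openai.api_key = "sk-..."  # the customer's API key

# Hypothetical customer-specific training data in chat format (JSONL),
# e.g. past support tickets rewritten as prompt/response pairs.
training_file = openai.File.create(
    file=open("customer_support_examples.jsonl", "rb"),
    purpose="fine-tune",
)

# Start a fine-tuning job that derives a small customized model
# from the general-purpose gpt-3.5-turbo base model.
job = openai.FineTuningJob.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)
print(job.id, job.status)

# When the job finishes, the customer calls its own derived model
# (named "ft:gpt-3.5-turbo:<org>::<id>") exactly like the base model.
```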
The biggest obstacle to using large models like GPT today is cost (the second is data compliance and privacy). But as newer models like GPT-4 continue to mature, economies of scale will push hardware costs down and usage costs will fall significantly. Continued innovation and optimization by OpenAI's "model factory" will put advanced language models within everyone's reach.
For OpenAI, the profitability of the 2B business rests on the productivity gains it delivers to customers, on which it can levy a kind of "technology usage tax." This Model-as-a-Service approach will also challenge traditional SaaS companies.
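In practice, the "technology usage tax" is metered, per-token billing, much like a utility bill. A toy illustration with made-up prices (they are assumptions chosen for the arithmetic, not OpenAI's actual rates):

```python
# Toy usage-based billing; prices are illustrative assumptions only.
PRICE_PER_1K_INPUT = 0.01    # assumed $ per 1,000 input tokens
PRICE_PER_1K_OUTPUT = 0.03   # assumed $ per 1,000 output tokens

def monthly_bill(input_tokens: int, output_tokens: int) -> float:
    """Charge for a customer's metered model usage in one month."""
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT \
         + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT

# A customer pushing 50M input and 10M output tokens through the model:
print(f"${monthly_bill(50_000_000, 10_000_000):,.2f}")  # -> $800.00
```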
【Two】2C: Subscription is Just the First Step, the Future is the "Super Entry Point"
In March of this year, The Information reported that 70-80% of OpenAI's revenue came from 2C subscriptions, namely the $20/month ChatGPT Plus, which had roughly 2 million subscribers at the time.
It must be pointed out that any new technology can look worthless until a "killer application" appears. Amazon founder Jeff Bezos once said in a speech that electricity struggled to enter ordinary households because people didn't know what it was for; only after someone invented the light bulb did electricity truly become infrastructure.
Bezos later added in an interview that invention itself is not the disruption; winning users is. ChatGPT can be seen as exactly such a case, and ChatGPT Plus is users' recognition of GPT.
So the $20 monthly subscription is really just a small-scale trial, whose purpose is to show that users are willing to pay for this application, i.e., that it is a killer application.
In fact, the truly vast opportunity on the 2C side is to become the "super entry point" on mobile devices.
Imagine picking up your phone, unlocking it, and seeing only one app on the home screen, an OpenAI intelligent assistant, much like when the iPhone removed the keyboard and left only the Home button.
You tell the assistant, "Help me plan a trip to Japan," and it proactively contacts flight, hotel, and attraction services, returns a detailed itinerary covering every particular, and adjusts the details on request. OpenAI may need only a few minutes to generate a thorough plan for you. That means you no longer have to open apps like Xiaohongshu or Mafengwo for travel guides, or compare flight prices on Ctrip and Fliggy. The "mindshare" that many apps on your phone have built up would collapse in an instant.
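The technical building block for this kind of orchestration already exists in OpenAI's function-calling interface: a developer describes external services as functions, and the model decides which to invoke and with what arguments. A minimal sketch, assuming the openai Python package (its 2023, pre-v1 interface); the search_flights tool and its schema are hypothetical stand-ins for real flight, hotel, and attraction services:

```python
import openai

# Hypothetical tool the assistant can call; name and schema are
# illustrative stand-ins for real travel services.
functions = [
    {
        "name": "search_flights",
        "description": "Find flights between two cities on a given date",
        "parameters": {
            "type": "object",
            "properties": {
                "origin": {"type": "string"},
                "destination": {"type": "string"},
                "date": {"type": "string", "description": "YYYY-MM-DD"},
            },
            "required": ["origin", "destination", "date"],
        },
    },
]

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Help me plan a trip to Japan"}],
    functions=functions,
    function_call="auto",  # the model decides whether and what to call
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The app would run the real flight search here, then feed the results
    # back to the model so it can keep building the itinerary.
    print(message["function_call"]["name"], message["function_call"]["arguments"])
```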
As for whether this assistant will spout nonsense, consider two points. First, OpenAI's models can already connect to the internet. Second, even before they could, a friend who runs customized African tours told me that in their tests, given identical customer requirements, the itineraries ChatGPT produced were no worse than those of travel planners with 7-8 years of experience.
As for price comparison, that is easily solved; just look at Amazon's playbook. Amazon uses dynamic pricing to keep the prices on its site below competitors'. Once consumers get used to this, they stop comparing prices and simply "one-click order" on Amazon, not realizing that by then Amazon is no longer the cheapest option for many items.
For OpenAI, this 2C super-entry-point model could seriously challenge mobile app store revenue. Global smartphone app store revenue has already reached $50 billion, but that is just the beginning: once super entry points emerge, the roughly $450 billion in global digital advertising revenue may also be redistributed.
【Three】Conclusion
Whether through the 2B "model factory" or the 2C "super entry point," OpenAI has the potential to ride these waves into a trillion-dollar race, and the 2B and 2C tracks will also reinforce each other. Even if only one of the two models ultimately succeeds, hitting a revenue target of several billion dollars is a minor matter.
The future of these stories hinges on whether model scaling laws keep advancing. As long as the three elements (computing power, data, and algorithms) continue to progress, LLMs will inevitably become more refined and costs will keep falling. If all these obstacles can be overcome, OpenAI's valuation is not a fantasy. That is what we have to look forward to.
As Microsoft co-founder Bill Gates said, most of us overestimate what we can do in one year and underestimate what we can do in five years.
"Retail Wei Observation" focuses on the latest strategies, tactics, and thoughts in the field of new retail and new consumption from a global perspective. The platform's founder, Wang Ziwei, is an independent analyst.