OpenAI is reportedly facing significant challenges in developing its upcoming artificial intelligence model, known as Orion. Internal evaluations suggest the model delivers only moderate improvements over previous iterations, raising concerns about the future pace of advances in generative AI. Feedback from OpenAI employees who tested Orion, shared with The Information, indicates that the gains are minimal compared with the leap from GPT-3 to GPT-4.
Once regarded as the leading entity in the AI sector, OpenAI may find its pre-eminence threatened by this development. The slower rate of progress could fundamentally alter expectations regarding the advancement of generative AI models, as well as the trajectory of growth for companies that rely heavily on these technologies.
Researchers at OpenAI have noted that, despite improvements in language processing, Orion is encountering bottlenecks, particularly in coding tasks. These challenges largely stem from a limited supply of high-quality training data, and the diminishing stock of suitable data poses a substantial obstacle, as robust datasets are crucial for pre-training Large Language Models (LLMs). One alarming prediction holds that AI developers may exhaust their training material by 2028.
As the availability of quality data wanes, progress in certain areas of AI is likely to slow. Some companies are exploring synthetic data for training, but whether such methods can adequately replace real-world data remains uncertain.
Moreover, the escalating computational requirements for training these AI models not only heighten costs but also demand more energy. Leading technology firms are in the process of constructing additional data centres to underpin their AI strategies; however, they face difficulties in securing cost-effective energy sources. If these hurdles persist, the financial viability of developing future AI models could be jeopardised.
Energy consumption in artificial intelligence is another critical aspect under consideration, as the power needed to operate extensive data centres may lead to notable environmental consequences.
Historically, the prevailing assumption in generative AI has been that more data and more computational power yield more sophisticated models. The bottlenecks OpenAI has encountered with Orion challenge this long-held belief. Should the pace of AI innovation continue to slow, investors may become more reluctant to fund AI ventures as they have in the past. End-users of AI models are therefore expected to absorb the resulting cost implications, as AI companies adjust their pricing structures to recoup lost revenues.
Source: Noah Wire Services