Glean provides tools for searching through apps such as Gmail, Slack, and Salesforce. Qi said that new AI techniques for parsing language would help Glean's customers find the right file or conversation much faster.
But training such a cutting-edge AI algorithm costs millions of dollars. So Glean uses smaller, less capable AI models that cannot extract as much meaning from text.
"It is difficult for us to get to the same level of results" as a company like Google or Amazon, on a smaller budget, Qi said. Training the most powerful AI models, he said, is "impossible."
Over the past ten years, AI has produced exciting breakthroughs: programs that can beat humans at complex games, drive cars through city streets under certain conditions, respond to spoken commands, and write coherent text from brief prompts. Writing in particular relies on recent advances in computers' ability to parse and manipulate language.
These advances are largely the result of feeding algorithms more text as training examples and giving them more chips with which to digest it. And that costs money.
Consider OpenAI's language model GPT-3, a large, mathematically simulated neural network trained on a huge amount of text scraped from the web. GPT-3 finds statistical patterns that let it predict, with surprising coherence, which words should follow others. Out of the box, GPT-3 is significantly better than previous AI models at tasks such as answering questions, summarizing text, and correcting grammatical errors. By one measure, it is 1,000 times more capable than its predecessor, GPT-2. But training GPT-3 cost, by some estimates, nearly $5 million.
"If GPT-3 were accessible and cheap, it would totally supercharge our search engine," Qi said. "That would be really, really powerful."
The climbing cost of training advanced AI is also a problem for established companies looking to build their AI capabilities.
Dan McCreary leads a team within a division of Optum, a health-IT company, that uses language models to analyze call transcripts in order to identify higher-risk patients or recommend referrals. He said that even training a language model one-thousandth the size of GPT-3 can quickly eat up his team's budget. Models need to be trained for specific tasks, and doing so can cost more than $50,000, paid to cloud computing companies to rent their computers and software.
McCreary said cloud computing providers have little reason to lower costs. "We cannot trust that cloud providers are working to lower the costs of building our AI models," he said. He is looking into buying specialized chips designed to accelerate AI training.
Part of why AI has advanced so rapidly in recent years is that many academic labs and startups could download and use the latest ideas and techniques. The algorithms that produced breakthroughs in image processing, for example, emerged from academic labs and were developed using off-the-shelf hardware and openly shared data sets.
Over time, though, it has become increasingly clear that progress in AI is tied to exponential growth in the underlying computing power.
Big companies have always had advantages in budget, scale, and reach, of course. And large amounts of computing power are table stakes in industries such as drug discovery.
Now some are pushing to scale things up further. Microsoft said this week that, working with Nvidia, it had built a language model twice the size of GPT-3. Researchers in China say they have built a language model four times larger than that.
"The cost of training AI is absolutely going up," said David Kanter, executive director of MLCommons, an organization that tracks the performance of chips designed for AI. The idea that larger models can unlock valuable new capabilities can be seen across many areas of the tech industry, he said. It may explain why Tesla is designing its own chips just to train AI models for autonomous driving.
Some worry that the climbing cost of tapping the latest and greatest technology could slow the pace of innovation by reserving it for the biggest companies, and those that lease their tools.
"I think it does cut down on innovation," said Chris Manning, a Stanford professor who specializes in AI and language. "When we have only a small number of places where people can play with the innards of these models at that scale, that has to massively reduce the amount of creative exploration that happens."