ChatGPT-Style Search Comes at a Cost: A 10x Increase per Query

No one can deny that the stock market takes ChatGPT-style search seriously: Google shed roughly $100 billion in market value after its poorly run AI search event. Turning a chatbot into a successful business, however, is another matter.

Google has had a chat-style search interface for seven years now in the form of Google Assistant, yet the world’s largest advertising company has never managed to meaningfully monetize it.

A new report from Reuters highlights another financial hurdle in generating a chat session for every search: the approach costs far more to run than a conventional search engine.

When you search on Google today, the engine looks up your query in its web index and ranks the matching pages by relevance, showing the most useful results first. The expensive work, crawling and indexing a tremendous amount of web content, happens ahead of time, which keeps each individual query cheap.
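As a rough illustration of that division of labor, here is a minimal, hypothetical sketch in Python: an inverted index built once, up front, so that each query reduces to a lookup and a lightweight ranking pass. The pages, terms, and scoring are invented for this example; real ranking is vastly more sophisticated.

```python
# Toy illustration (not Google's actual pipeline): the costly work of crawling
# and indexing happens offline, so serving a query is just a cheap lookup.
from collections import defaultdict

# Pretend these pages were crawled and indexed ahead of time.
pages = {
    "p1": "chatgpt style search costs more to run",
    "p2": "google search ranks pages by relevance",
    "p3": "search indexes are built ahead of time",
}

# Build an inverted index: term -> set of page ids containing that term.
index = defaultdict(set)
for page_id, text in pages.items():
    for term in text.split():
        index[term].add(page_id)

def search(query: str):
    """Score each page by how many query terms it contains, best first."""
    scores = defaultdict(int)
    for term in query.split():
        for page_id in index.get(term, set()):
            scores[page_id] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("google search relevance"))  # p2 ranks first: it matches all three terms
```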

Google says a regular search query typically returns in well under a second. A ChatGPT-style search engine, by contrast, would have to fire up a huge neural network, loosely modeled on the human brain, for every query, generate a textual answer, and likely still consult the search index for accurate, factual information.
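To make the contrast concrete, here is a purely illustrative sketch of that chat-style flow. The helper names (retrieve_from_index, run_large_language_model) are hypothetical stand-ins for the retrieval and generation steps described above, not any real Google or OpenAI API.

```python
# Illustrative only: the general shape of a retrieval-augmented chat answer.
# Both helper functions are hypothetical stand-ins, not real APIs.

def retrieve_from_index(query: str) -> list[str]:
    """Stand-in for the cheap index lookup a classic search engine performs."""
    return ["snippet about topic A", "snippet about topic B"]

def run_large_language_model(prompt: str) -> str:
    """Stand-in for the expensive part: a forward pass through a huge neural
    network that must run on accelerators for every single query."""
    return "A generated, conversational answer grounded in the snippets."

def chat_search(query: str) -> str:
    # 1. Still consult the index for factual grounding (relatively cheap).
    snippets = retrieve_from_index(query)
    # 2. Then generate a free-form answer with the model (the cost driver).
    prompt = f"Question: {query}\nSources: {snippets}\nAnswer:"
    return run_large_language_model(prompt)

print(chat_search("why does chat-style search cost more?"))
```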

And because the interface is conversational, interactions rarely stop at a single query; follow-up questions keep the model running far longer than a one-off search.

John Hennessy, chairman of Google’s parent company Alphabet, and several analysts told Reuters that an exchange with an AI known as a large language model likely costs around 10 times more than a standard keyword search, which could add up to several billion dollars in additional costs.

Exactly how much of Google’s roughly $60 billion in yearly net income a chatbot would absorb depends on which estimate you believe.

Morgan Stanley estimates that Google would face a $6 billion yearly cost increase if a ChatGPT-like AI handled half of the queries it receives with 50-word answers, while a study by the consulting firm SemiAnalysis puts the figure at around $3 billion.
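As a back-of-envelope check (my own arithmetic, using only the numbers quoted in this article: 8.5 billion searches per day, half of them handled by AI, and Morgan Stanley’s $6 billion figure), the implied marginal cost works out to well under a cent per AI-generated answer; it only turns into billions because of Google’s enormous query volume.

```python
# Back-of-envelope arithmetic from the figures cited in this article;
# the per-answer cost below is simply implied by those assumptions.
searches_per_day = 8.5e9       # Google searches per day
share_handled_by_ai = 0.5      # Morgan Stanley scenario: half of all queries
added_cost_per_year = 6e9      # Morgan Stanley's estimate, in dollars

ai_answers_per_year = searches_per_day * share_handled_by_ai * 365
implied_cost_per_answer = added_cost_per_year / ai_answers_per_year

print(f"{ai_answers_per_year:.2e} AI answers per year")   # ~1.55e+12
print(f"${implied_cost_per_answer:.4f} per answer")       # ~$0.0039
```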

In its initial blog post announcing the “Bard” chatbot, Google noted that it is starting with a “lightweight model version” of its language model, which requires significantly less computing power and can therefore be scaled to more users and gather more feedback.

It is striking to see Google being cautious about scale. The company already operates at enormous scale and handles massive quantities of data every day, so its “caution” here is really about how much money it is willing to spend.

Microsoft’s push into the search market exposes more of Google’s vulnerability than its own: most estimates put Bing’s worldwide market share at only about 3 percent, compared with roughly 93 percent for Google. That makes the rising cost of search a far heavier burden for Google.

Google handles a mind-boggling 8.5 billion searches every day, a scale Microsoft simply does not have to worry about; multiply even a small extra cost per query by that volume and the bill adds up very quickly.

Alphabet is already exploring ways to bring those costs down, and it has done so before: YouTube was pushed into profitability partly by lowering its serving costs, including with custom processor chips for video conversion.

Alphabet says:

“A couple year problem at worst.”

In other words, despite Google’s recent cost-cutting measures and its custom machine-learning server chips, the Tensor Processing Units, the higher cost of chat-style search may weigh on the product “for a few years.”

ChatGPT-style search is a genuine step forward in natural language interfaces and could change how people look for information online. But the technology demands serious computing resources for every answer, and until those inference costs come down, each conversational reply will cost companies like Google and Microsoft roughly ten times as much to serve as the keyword search it replaces.

Source: Ars Technica
