Tech investors want to see industry leaders boost profitability as they cut costs. But there is one area where they would like to see significant investment: artificial intelligence.
This week, Alphabet, Microsoft, Amazon, and Meta reported quarterly results, providing Wall Street with an update on their efforts to improve efficiency as economic concerns mount. When it comes to AI and the recent surge in so-called large language models (LLMs), which power technologies like ChatGPT, the mega-cap tech businesses can’t afford to fall behind.
Generative AI programs use growing quantities of data and processing power to produce outputs that appear to be created by humans — a block of text, a snippet of code, or a computer-generated image. They require specialized supercomputers, which are not cheap.
This week, tech CEOs spoke at length about the possibilities of AI, whether they are building their own models or rapidly integrating it into products. A recurring theme was the massive sums they will spend to build and run these applications.
The following is what executives from Alphabet, Microsoft, Amazon, and Meta said to analysts:
Sundar Pichai, Alphabet’s CEO, is under intense pressure to ship AI products because of the apparent threat that smart chatbots pose to the company’s core Google search engine. The company recently issued an internal “code red.”
On the company’s earnings call on Tuesday, Pichai stated that the company was making “good progress” towards its AI objectives.
“We’ll continue to incorporate generative AI advances to make search better in a thoughtful and deliberate way,” Pichai said.
He said Google is using AI to raise the conversion rate of advertisements and to reduce the amount of “toxic text” fed into AI models. The company is also merging DeepMind and Brain, two of its main AI teams.
Pichai said that in addition to using chips from Nvidia, which supplies the vast majority of the graphics processors used to train and deploy cutting-edge AI, Google powers its models with chips it designed in-house.
Microsoft’s Teams teleconferencing system, Office, and Bing search engine all make use of OpenAI’s GPT technology.
CEO Satya Nadella said AI is already driving downloads of the company’s apps and will eventually drive revenue growth. Bing’s downloads have quadrupled since Microsoft added a chatbot, he said, and more than 200 million images have been created through Bing.
Nadella said the large data centers required to run AI applications will demand sizable investment.
“We will continue to invest in our cloud infrastructure, particularly AI-related spend, as we scale to the growing demand driven by customer transformation,” Nadella said. “And we expect the resulting revenue to grow over time.”
Amazon CEO Andy Jassy gave an unusually detailed answer on Thursday when an analyst asked about the company’s generative AI ambitions. Jassy said Amazon is building its own LLMs and designing its own data-center chips for machine learning, and he emphasized how large the market is.
“These large language models, generative AI capability, has been around for a while. But frankly, the models were not that compelling until about six to nine months ago,” Jassy said. “They have gotten so much bigger and so much better so much more quickly that it really presents a remarkable opportunity to transform virtually every customer experience that exists.”
Jassy added that Amazon’s scale would allow it to develop LLMs, work that can involve hundreds of computers running for weeks under the supervision of highly paid machine learning engineers.
“There will be a small number of companies that want to invest that time and money and we will be one of them at Amazon,” Jassy said.
Unlike Microsoft and Google, Amazon’s primary objective is to sell access to the technology through its Amazon Web Services segment. The company will also work on specific applications, Jassy said, such as tools to help engineers write code.
“Every single one of our businesses inside of Amazon are building on top of large language models to reinvent our customer experience,” Jassy said. That includes voice assistant Alexa, he said.
Meta CEO Mark Zuckerberg, who refocused the company on the metaverse in late 2021, tried to dispel the notion that it is no longer his top priority.
He did, however, want investors to understand that Meta can invest in metaverse technology while also pouring a ton of money into AI, which he called a “key theme” for the business.
Zuckerberg said the company has long used machine learning to power products such as Facebook’s news feed, ad systems, and recommendations, but generative foundation models are now its primary focus.
“It’s been a pretty amazing year of progress on this front, and the work happening now is going to impact every single one of our apps and services,” Zuckerberg said.
He said the company would use the technology to build a range of products, including chat features for Facebook Messenger and WhatsApp, tools for creating images for Facebook and Instagram posts, and eventually software that can generate complete videos from brief descriptions.
He is particularly interested in the idea of “AI agents,” a term for AI programs that can complete tasks.
“There’s an opportunity to introduce AI agents to billions of people in ways that will be useful and meaningful,” Zuckerberg said. One possibility for an AI agent would be to handle customer service for businesses, Meta has said.
Zuckerberg also spoke about the significant sums Meta has spent to equip its data centers for AI applications, saying the technology was the “main driver” of the company’s increased capital expenditures over the past few years.
“At this point we are no longer behind in building out our AI infrastructure,” Zuckerberg said.
That doesn’t mean Meta has stopped buying graphics processors. Zuckerberg said the company will need to deploy its generative AI products and get a better sense of the resources they require before it decides how to “continue investing.”
(Adapted from CNBC.com)