Article reprint source: AIcore

Original source: GenAI New World

Author: Lu Ke

According to overseas media reports, although many technology companies have begun selling generative AI products such as business assistants and code-generation services, most of them are still searching for a viable business model for generative AI.

For many companies, generative AI tools are not only unproven but also extremely expensive to run: they require powerful servers with costly chips and consume enormous amounts of electricity.

To that end, Microsoft, Google, Adobe, and many other technology companies working on AI are experimenting with different strategies for marketing and pricing generative AI.

An industry insider said that Microsoft's first batch of generative AI products has actually lost money. Microsoft plans to recoup the losses by launching upgraded AI products that cost more and offer more features, a model that technology companies such as Google also plan to adopt. Companies such as Adobe plan to set monthly usage caps for users and charge according to usage. Others, such as Zoom, plan to cut costs by building simpler AI models.

According to Adam Selipsky, CEO of Amazon’s cloud computing division, “A lot of customers I talk to are unhappy with the cost of running generative AI models.”

Chris Young, Microsoft's head of corporate strategy, said: "Businesses and consumers need to spend time understanding how they want to use AI technology and how much they are willing to pay for it. Clearly, we now have to translate the initial excitement and curiosity into real adoption."

Unlike most other software, AI products can take years and hundreds of millions of dollars to build and train.

AI software does not share the economics of most software, where serving each additional user costs almost nothing. Because every user query requires substantial computing power, the more customers use these products, the higher the infrastructure bills vendors must pay. These heavy operating costs expose companies that charge flat subscription fees, such as Microsoft, to potentially large losses.

Using AI technology from its partner OpenAI, Microsoft launched GitHub Copilot, a service that helps programmers write, fix, and translate code. It is popular with programmers and has been used by more than 1.5 million people; it generates code for users and cuts the time and effort programming requires.

However, because of its high operating costs, GitHub Copilot remains a money-losing service. Users pay a $10 monthly subscription fee, but at the beginning of this year the service was losing Microsoft an average of $20 per user per month, and some Copilot users cost the company as much as $80 per month.
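The arithmetic behind those figures can be made explicit. The sketch below uses only the numbers reported above ($10 fee, $20 average loss, up to $80 for heavy users); the implied per-user compute cost is an inference, not a disclosed figure.

```python
# Back-of-the-envelope Copilot economics from the figures reported above.
# The per-user compute cost is not disclosed; it is inferred from the
# reported average loss, so treat these numbers as illustrative only.

SUBSCRIPTION_FEE = 10   # $ per user per month
AVERAGE_LOSS = 20       # reported average loss per user per month

def monthly_margin(fee: float, compute_cost: float) -> float:
    """Revenue minus compute cost for one user in one month."""
    return fee - compute_cost

# Implied average compute cost: the fee plus the reported loss.
implied_avg_cost = SUBSCRIPTION_FEE + AVERAGE_LOSS            # $30
print(monthly_margin(SUBSCRIPTION_FEE, implied_avg_cost))     # -20
print(monthly_margin(SUBSCRIPTION_FEE, 80))                   # heavy user: -70
```

The key point is that margin is negative per user and gets worse with usage, the opposite of conventional software economics.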

Microsoft and GitHub did not respond to requests for comment on whether the service will make money, but if computing costs drop, the profitability of GitHub Copilot and other AI assistants could change.

To make up for the losses, Microsoft may charge users higher fees; it reportedly plans to raise subscription prices for its AI products. Take Microsoft 365 as an example: the basic enterprise subscription for business users costs $13 per month. The AI-enhanced version, which can draft emails and create PowerPoint presentations and Excel spreadsheets, will cost an additional $30 per month.

Google has released similar AI assistant software, with a current base subscription fee of $6 per month. But as the software is upgraded, Google may charge an additional $30 per month for the subscription.

Microsoft and Google are not alone: many technology companies hope to balance the books through higher add-on fees, because AI technology is genuinely expensive. Efficient AI services depend on powerful AI models, and compared with other software and cloud services, those models consume more electricity and place heavier demands on computing hardware.

Take OpenAI’s GPT-4, one of the largest and most expensive AI models in the world. Using it to summarize an email is like driving a Lamborghini to deliver pizzas.

Jean-Manuel Izaret, head of marketing, sales and pricing at Boston Consulting Group, once said: "Big AI models can bring powerful intelligence, but they also require huge computing power."

For companies that don't want to raise fees and anger their customers, switching to a cheaper AI model is one option.

Microsoft has long been exploring less powerful but cheaper models to power Bing search. Bing Chat currently runs on OpenAI's technology, and Microsoft has also considered using Meta Platforms' open-source large-model tools to support Bing.

Zoom has taken this cost-saving approach, building a smaller, cheaper AI assistant powered by multiple models, including ones developed by OpenAI and Meta. The software reserves the most expensive AI models for only the hardest tasks.
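The routing pattern described above can be sketched as follows. This is a hypothetical illustration, not Zoom's actual implementation: the model names, prices, and the difficulty heuristic are all invented.

```python
# Hypothetical sketch of cost-aware model routing: send each request to the
# cheapest model that can handle it, reserving a large, expensive model for
# the hardest tasks. All names and numbers here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # hypothetical prices

SMALL = Model("small-local-model", 0.0002)
LARGE = Model("large-hosted-model", 0.03)

def estimate_difficulty(task: str) -> float:
    """Toy heuristic: longer, more open-ended prompts count as harder."""
    score = len(task) / 500
    if any(word in task.lower() for word in ("analyze", "reason", "plan")):
        score += 0.5
    return score

def route(task: str, threshold: float = 0.5) -> Model:
    """Pick the cheap model for easy tasks, the expensive one for hard tasks."""
    return LARGE if estimate_difficulty(task) >= threshold else SMALL

print(route("Summarize this meeting in two sentences.").name)        # small-local-model
print(route("Analyze the Q3 numbers and plan next year's budget.").name)  # large-hosted-model
```

In production such a router would use a trained classifier or confidence signal rather than keywords, but the economics are the same: most traffic goes to the cheap path.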

According to Smita Hashim, Zoom's product chief, users can use the AI assistant to summarize meetings or generate text without paying extra, because the software avoids relying on the most expensive AI technology. "We were shocked by the high prices of some competitors," Hashim said.

In addition to raising fees and reducing costs, some companies choose to restrict user usage to reduce costs.

Adobe uses a credit system to ensure that its image-generation software Firefly does not lose money. Once users exceed their monthly allotment, Adobe slows down the service to keep the feature from being overused. According to Adobe CEO Shantanu Narayen, Adobe wants to make things convenient for users while also protecting itself on costs.
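The metering-plus-throttling pattern described above can be sketched minimally. This is an assumption-laden illustration, not Adobe's implementation: the quota size, credit costs, and delay are invented.

```python
# Minimal sketch of a credit-based usage meter that throttles instead of
# blocking once the monthly quota is exhausted. Quota and delay values are
# invented for illustration.

import time

class CreditMeter:
    def __init__(self, monthly_credits: int, throttle_delay_s: float = 2.0):
        self.remaining = monthly_credits
        self.throttle_delay_s = throttle_delay_s

    def charge(self, credits: int = 1) -> bool:
        """Deduct credits for one generation; return True if the request
        was throttled (quota exhausted), False if served at full speed."""
        if self.remaining >= credits:
            self.remaining -= credits
            return False
        # Over quota: still serve the request, but slow it down.
        time.sleep(self.throttle_delay_s)
        return True

meter = CreditMeter(monthly_credits=2, throttle_delay_s=0.01)
print(meter.charge())  # False: within quota
print(meter.charge())  # False: within quota
print(meter.charge())  # True: over quota, request is throttled
```

Slowing rather than blocking preserves the user experience while capping the provider's worst-case compute bill per subscriber.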

Adobe leaders released Adobe Firefly, a family of generative AI models.

Many companies expect the cost of generative AI to fall over time, as it did for technologies such as cloud storage and 3D animation; new chips and other innovations should reduce AI computing costs. Earlier this year OpenAI cut the cost of using the older version of ChatGPT, letting more users access it for free, while those who want the latest version pay a $20 monthly subscription. How much profit that approach yields remains unclear.

So far, the uncertainty around these pricing models has not deterred investors. Shares of AI-related companies have soared this year, and OpenAI's valuation has reached $90 billion, triple what it was at the start of the year.

According to May Habib, CEO of AI tool developer Writer, rising valuations reflect expectations for AI's future, but as the enthusiasm fades, executives will pay more attention to costs. Habib expects many AI-specific budgets to be cut next year.