Article reprint source: AI Trends

Source: Data Ape

Author: Yisuo Yanyu


Large models have gone from cutting-edge academic research to a hot topic in the business world. Domestic technology giants and emerging startups alike are actively exploring their commercial applications, trying to use them to bring innovation and growth to existing businesses.

This "forced march" trend is manifested in two aspects: first, big models are being deeply integrated with traditional core business systems such as ERP, CRM, BI, finance, marketing, operations and customer service to make them more intelligent and automated; second, big models are beginning to be widely used in multiple industries such as finance, manufacturing, retail, energy and entertainment, promoting industry innovation and transformation.

After trying ChatGPT's web-browsing mode, Microsoft Bing, Baidu Wenxin Yiyan, Taobao Wenwen, and other products, the author found that obvious problems remain in the commercial use of large models.

Specifically, three problems must be solved before commercial success is possible:

System integration

As large models are gradually integrated into daily business operations, their role has gone beyond simple data processing and computation. This new kind of intelligent model needs to interact with numerous business systems in real time and respond to all kinds of business needs. In theory, this is where the real value of large models lies; in practice, it is also a major technical challenge.

We need to recognize that every business system has its own history and technical architecture, which give it a distinct identity. These systems do not exist by chance; they were designed and built around specific historical contexts, business needs, and technology trends.

For example, early ERP systems may have been created in an era when computing resources were limited and networks were not mature enough. Their design concepts, data structures, and functional features were closely related to the technology and business environment at the time. They may have been based on traditional relational databases and service-oriented architectures, rather than modern microservices or container technologies.

In contrast, modern marketing automation platforms grew up in the era of cloud computing and big data. They naturally possess powerful data processing capabilities, dynamic scalability, and rich API interfaces.

These technical differences fundamentally shape the integration strategy between large models and these systems. Trying to force every system under a single standard is plainly unrealistic.

The integration strategy must therefore be diverse, taking each system's characteristics and needs into account. Specifically, systems built on older technologies may need "adapters" or "middle layers" that transform data and business logic so they can connect smoothly with the large model. Systems already built on modern technologies can be integrated more directly, but data consistency and integrity must still be guaranteed.
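To make the adapter idea concrete, here is a minimal Python sketch. Everything in it is an assumption for illustration: the legacy record layout, the status codes, and the `LegacyErpAdapter` class stand in for whatever a real legacy system actually exposes.

```python
from dataclasses import dataclass

# Hypothetical legacy record: a flat tuple, as an old ERP might return it.
# Layout (assumed): (order_id, customer_code, amount_cents, status_flag)
LegacyOrderRow = tuple

@dataclass
class OrderEvent:
    """Normalized structure the model-facing layer understands."""
    order_id: str
    customer_id: str
    amount: float
    status: str

class LegacyErpAdapter:
    """The 'middle layer': translates legacy rows into model-ready events."""

    STATUS_MAP = {0: "pending", 1: "paid", 2: "shipped", 9: "cancelled"}  # assumed codes

    def to_event(self, row: LegacyOrderRow) -> OrderEvent:
        order_id, customer_code, amount_cents, status_flag = row
        return OrderEvent(
            order_id=str(order_id),
            customer_id=str(customer_code).strip().upper(),
            amount=amount_cents / 100.0,  # cents -> currency units
            status=self.STATUS_MAP.get(status_flag, "unknown"),
        )

if __name__ == "__main__":
    adapter = LegacyErpAdapter()
    print(adapter.to_event((10023, " c-881 ", 459900, 1)))
```

The point is not the specific fields but the pattern: the large model side sees one clean structure, while all the legacy quirks stay inside the adapter.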

Interfaces, meanwhile, act as the "bridges" of information technology, carrying information between different systems. Interface standardization has long been a goal in the IT field, but given how technology has evolved and accumulated over time, interface diversity is inevitable.

This diversity of interfaces poses a serious challenge to large model integration. Behind each interface standard or protocol sits a specific data structure, calling convention, and security mechanism. For a large model to interact seamlessly with these systems, a corresponding adapter must be developed for each interface. Beyond maintaining the large model itself, these adapters must also be frequently updated and optimized to keep pace with business system iterations and interface changes.

How can these problems be solved? API management and microservice architecture offer a promising path. By adopting API management tools and a microservice architecture, enterprises can modularize the interaction between large models and other systems, making it more flexible and scalable.

The core idea of microservice architecture is to decompose a large, complex system into many small, independent services that interact through clearly defined APIs. This architecture brings significant benefits to large model integration: by dividing the system's functions across multiple microservices, each part's interaction with the large model becomes more flexible.

Each microservice can be scaled, deployed, and maintained independently without affecting the others, while API management tools give developers a unified platform for connecting each microservice with the large model.
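As a minimal sketch of what this can look like, the following assumes FastAPI as the service framework; the service name, the `/recommend` endpoint, and the `call_large_model` placeholder are illustrative, not any particular vendor's API.

```python
from fastapi import FastAPI
from pydantic import BaseModel

# One small, independently deployable microservice in front of the model.
app = FastAPI(title="recommendation-service")

class RecommendRequest(BaseModel):
    user_id: str
    context: str

class RecommendResponse(BaseModel):
    items: list[str]

def call_large_model(prompt: str) -> list[str]:
    """Placeholder for the real model call; swap in an actual client here."""
    return [f"item suggested for: {prompt}"]

@app.post("/recommend", response_model=RecommendResponse)
def recommend(req: RecommendRequest) -> RecommendResponse:
    # The API management layer routes requests here; the model sits behind it.
    prompt = f"user={req.user_id} context={req.context}"
    return RecommendResponse(items=call_large_model(prompt))

# Run with: uvicorn this_module:app --reload
```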

Data connectivity

In today's data-driven era, the large model is like a huge intelligent "heart" that processes and analyzes information and supplies intelligent recommendations and decisions to the various business systems. Those systems, from CRM and ERP to finance and marketing, are like the blood vessels and organs intertwined with it, each complementing the others. The blood flowing through this body is data.

Ideally, every transaction, every user behavior, and every piece of customer feedback generates data. That data flows from the business systems to the large model, is analyzed and processed, and then returns to the corresponding business system to give users more accurate services or decisions.

Let’s look at an example.

Suppose Miss Wang is a loyal user of a well-known online shopping platform. Every time she browses a product, adds it to her cart, or makes a purchase, the platform's backend silently records the behavioral data. When that data reaches the large model in real time, the model immediately performs an in-depth analysis, combining it with her past shopping records and browsing history. It quickly recognizes that Miss Wang has recently taken a strong interest in summer women's clothing and may need accessories to match her newly purchased dress.

When she opens the e-commerce platform's large model app, she can interact with it in real time and ask it to recommend products. The model can then suggest a series of shoes, bags, and other summer accessories that match the dress.

If she clicks on one of the recommended shoes, browses the details, and finally buys it, that purchase is also recorded and transmitted back to the large model. This loop shows how much accurate services and decisions depend on smooth data flow between the large model and the business systems.
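The whole round trip can be sketched in a few lines of Python. Everything here is a toy stand-in: an in-memory event log plays the business system, and `model_recommend` fakes the analysis a real large model would perform on the user's history.

```python
from collections import defaultdict

# In-memory stand-in for the business system's event store.
event_log: dict[str, list[dict]] = defaultdict(list)

def record_event(user_id: str, action: str, item: str) -> None:
    """Business system side: every behavior becomes a data point."""
    event_log[user_id].append({"action": action, "item": item})

def model_recommend(user_id: str) -> list[str]:
    """Model side: a toy 'analysis' of recent behavior. A real system would
    send this history to a large model as context."""
    recent = [e["item"] for e in event_log[user_id] if e["action"] in ("view", "purchase")]
    unique = list(dict.fromkeys(recent))  # de-duplicate, keep order
    return [f"accessory for {item}" for item in unique[-2:]]

# The round trip: behavior in, recommendation out, new behavior recorded again.
record_event("wang", "view", "summer dress")
record_event("wang", "purchase", "summer dress")
print(model_recommend("wang"))
record_event("wang", "purchase", "accessory for summer dress")
```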

However, that is only the ideal situation; in reality, all kinds of problems can arise. First, simply connecting the data of the various business systems to the large model is hard.

Take Taobao Wenwen as an example. At present, its data is not integrated with the rest of the Taobao system, so it does not know the user's preferences. It sits inside Taobao as an information island, never organically woven into Taobao's overall data fabric.

Furthermore, even once the large model and the business systems are connected, the differing histories, technical architectures, and data standards of each system make "blockages" and "leaks" in the data flow highly likely. These can not only lose data but also skew the large model's analysis.

Take e-commerce platforms again. As users browse and buy, their behavioral data is sent to the large model for analysis so it can recommend better-suited products. But if data is lost in transit or does not match another system's format, the model may fail to recommend accurately, degrading the user experience.
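A common defense is to validate data at the boundary before it ever reaches the model, so mismatched formats are caught and logged instead of silently skewing the analysis. Below is a small sketch using pydantic; the `BehaviorEvent` schema and its fields are assumed for illustration, not any real platform's format.

```python
from pydantic import BaseModel, ValidationError

class BehaviorEvent(BaseModel):
    user_id: str
    action: str       # e.g. "view", "add_to_cart", "purchase"
    item_id: str
    timestamp: float  # epoch seconds; time formats are a classic cross-system sore spot

def ingest(raw: dict) -> BehaviorEvent | None:
    """Validate at the boundary: reject malformed records instead of letting
    them silently distort the model's analysis."""
    try:
        return BehaviorEvent(**raw)
    except ValidationError as err:
        print(f"dropped malformed event: {len(err.errors())} field error(s)")
        return None

print(ingest({"user_id": "wang", "action": "view", "item_id": "d-100", "timestamp": 1.7e9}))
print(ingest({"user_id": "wang", "action": "view"}))  # missing fields -> rejected
```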

Data flow between large models and the various business systems matters more and more, not only because data volumes keep growing, but because the way data creates value for the enterprise is changing. Yet achieving smooth, high-fidelity data flow between the large model and every system is far from easy.

We need to understand that data flow between the large model and the business systems is not a simple migration or transfer but a complex, two-way, continuous process. Each business system may interact with the large model frequently, while the large model itself keeps updating, learning, and evolving.

Countless technical and business problems hide behind such data flow. For example, because different systems update at different frequencies and times, the data in the large model may be inconsistent with a given business system's data at a given moment. Worse, different business systems may use different technical architectures, data formats, and interface standards, forcing data to be converted and adjusted repeatedly as it flows.

Data security and privacy cannot be ignored either. Data faces all kinds of threats during transmission, storage, and processing, and ensuring its integrity, confidentiality, and non-repudiation is a major problem for enterprises. In cross-regional, cross-network environments, transmission delays add another risk, which can be fatal for business systems that demand real-time responses.
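Integrity, at least, has well-worn tools. The sketch below uses only Python's standard library to attach an HMAC-SHA256 signature to each payload so the receiver can detect tampering or corruption in transit; the key and payload are placeholders.

```python
import hashlib
import hmac
import json

SECRET_KEY = b"replace-with-a-managed-secret"  # illustrative; load from a real key store

def sign(payload: dict) -> tuple[bytes, str]:
    """Serialize deterministically and attach an HMAC-SHA256 signature."""
    body = json.dumps(payload, sort_keys=True).encode()
    return body, hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    """Recompute the signature and compare in constant time."""
    expected = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

body, sig = sign({"user_id": "wang", "action": "purchase", "item_id": "d-100"})
assert verify(body, sig)             # intact payload passes
assert not verify(body + b"x", sig)  # tampered payload is detected
print("integrity check ok")
```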

Business integration

Large models have gradually penetrated industries and fields of every kind, becoming an important driver of enterprise intelligence. But making the technology genuinely deliver business value is not just a matter of technical implementation; it depends even more on tightly integrating technology with the business. To achieve this, large models must go deep into business details, understand the business logic, and be fully woven into the whole business system.

Imagine a large e-commerce company that wants to optimize its product recommendations with a large model. It is not enough for the model to recognize users' purchase histories; it also needs to understand their shopping habits, interests, search histories, and many other details. Beyond that, the model must grasp business considerations such as seasons, festivals, and promotions to ensure that what it generates is truly valuable.
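One lightweight way to hand such business details to a model is to fold them into its input. The Python sketch below does exactly that; the function name, the field names, and the Northern Hemisphere season rule are all assumptions made for illustration.

```python
from datetime import date

def build_recommendation_prompt(user_history: list[str],
                                preferences: list[str],
                                active_promotions: list[str]) -> str:
    """Fold business details that raw behavior data alone cannot carry
    into the model's input."""
    # Map month -> season (Northern Hemisphere; Dec/Jan/Feb = winter).
    season = ("winter", "spring", "summer", "autumn")[(date.today().month % 12) // 3]
    return (
        f"Season: {season}\n"
        f"Active promotions: {', '.join(active_promotions) or 'none'}\n"
        f"User interests: {', '.join(preferences)}\n"
        f"Recent purchases: {', '.join(user_history)}\n"
        "Task: recommend matching accessories, favoring promoted items."
    )

print(build_recommendation_prompt(
    user_history=["summer dress"],
    preferences=["summer women's clothing"],
    active_promotions=["mid-year sale"],
))
```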

This raises a key question: how can the large model understand and integrate these business details? Specifically, we can start from the following aspects:

1. Transfer business knowledge to break through the limits of data.

Data is undoubtedly the core input of large models, but data alone is not enough for true business understanding. Much business knowledge is implicit and unstructured, and hard to convey through conventional data channels. A company's core values, its long-term customer relationships, and subtle shifts in the industry, for example, may never show up directly in the data. Ignored, such knowledge can lead the model to decisions that diverge from the real business scenario.

Close cooperation with the business departments is therefore essential. They hold rich experience and deep knowledge of the business and can supply the details data cannot cover, spanning not only internal company knowledge but also the dynamics of partners, competitors, and the industry as a whole.

One approach worth considering is a dedicated business knowledge team of business experts, data scientists, and model engineers who work together to ensure the large model is trained fully and deeply on the business.

2. Customize large models to fit complex business logic.

The diversity of industries means business logic is complex. A large model used in finance is unlikely to transfer directly to retail or healthcare, because each industry has its own business rules and logic. This demands a high degree of customization when large models are designed and developed.

A large model's architecture, parameters, and even algorithms may need adjustment for a specific business. Some industries prize real-time performance, for instance, while others care more about long-term strategy, which forces a trade-off between computing speed and depth of analysis.

3. Build flexibility and iteration into large models to keep up with business change.

Business is never static; it shifts with time, markets, and technology. When business logic and rules change, the large model must be adjusted accordingly.

This requires not only flexibility in the model's design but also the ability to iterate and optimize quickly afterward. Continuous training, real-time business feedback, and the model's capacity for online learning are all key to keeping large models in step with the business.
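As a minimal illustration of such a feedback loop, the sketch below buffers business feedback and triggers a placeholder retraining step once a threshold is reached; the threshold and the `retrain` stub are assumptions for illustration.

```python
feedback_buffer: list[dict] = []
RETRAIN_THRESHOLD = 3  # illustrative; a real system would also watch drift metrics

def record_feedback(predicted: str, actual: str) -> None:
    """The business side reports what actually happened after each model decision."""
    feedback_buffer.append({"predicted": predicted, "actual": actual})
    if len(feedback_buffer) >= RETRAIN_THRESHOLD:
        retrain()

def retrain() -> None:
    """Placeholder: in practice, fine-tune or otherwise update the model here."""
    misses = sum(1 for f in feedback_buffer if f["predicted"] != f["actual"])
    print(f"retraining on {len(feedback_buffer)} feedback records ({misses} misses)")
    feedback_buffer.clear()

record_feedback("buy", "buy")
record_feedback("buy", "no-buy")
record_feedback("churn", "churn")  # threshold reached -> placeholder retrain runs
```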

Looking ahead, we foresee large models integrating even further with the business, no longer merely tools for data processing and analysis but the core drivers of entire business processes. That is not just a technological advance; it is a comprehensive transformation of business models, organizational structures, and ways of working.

Such a transformation will not happen overnight. It demands joint effort and collaboration among enterprise leaders, business teams, and technical teams, plus continuous learning, experimentation, and optimization to ensure the large model truly delivers business value. Enterprises will meet all kinds of challenges and difficulties along the way, but it is precisely these experiences that accumulate into valuable knowledge and capability and help them stand out from the competition.