Introduction:
When mid-sized companies ride the wave of large models.
Main Article:
Since the end of 2022, an AI revolution triggered by ChatGPT has been in full swing. From industry and academic heavyweights joining the large-model fray to hundreds of billions of yuan in capital investment, everything points to how much potential large models still hold.
Over the past year, the field's many players have coalesced into three major factions:
One is the internet giants. With vast resources, robust cloud infrastructure, and deep financial backing, they use large models as an entry point, aiming to retain customers through their ecosystems and cloud services.
Another is the emerging startup unicorns, led by star entrepreneurs with their glamorous stories, backed by investment institutions, and constantly in the spotlight.
The third is the “mid-sized companies” that have achieved real success in their respective fields. Clearly, they lack the giants’ deep pockets to pour hundreds of billions of RMB into building models, nor can they afford the freewheeling, radical innovation of startups.
The first two factions usually draw the most attention in the race, while the last one, the mid-sized companies, often seems to be overlooked. Yet they are actually the key to the last mile of large model implementation. The data, scenarios, and user advantages that mid-sized companies have accumulated over the years turn large models from showcases and theoretical AI concepts into practical technologies that can genuinely serve thousands of industries. This also aligns with the keyword of the large model industry in 2024: commercialization and real-world application.
In 2024, Yeahka (09923.HK), a Hong Kong-listed company with 13 years of history, is likewise using new AI technologies to empower its existing business scenarios as the large model wave sweeps in.
As a representative of many “mid-sized companies” in the AI 2.0 era, Yeahka’s strategy for large models is different from that of the giants and unicorns – and this may be the “shortest path” for mid-sized companies to realize the value of large models.
1. The Ecological Niche of “Mid-sized Companies” Based on Scenarios
“There is no need to compete head-on with BAT and AI startup unicorns in the underlying large model technology,” said Luo Xiaohui, the Executive Director and CTO of Yeahka.
Indeed, both resource-rich giants like Alibaba and ByteDance and AI unicorns like Zhipu and Moonshot focus on parameters, model performance, and other hard metrics. It could even be called their historical mission. However, this path does not suit every technology company in the AI wave.
Firstly, the investment in AI underlying research and development is huge, requiring continuous capital injection for trial and error, and the outcome may not be satisfactory.
Secondly, in this AI battle, technology is only a means, and ultimately, success depends on commercial applications in specific scenarios. In other words, no matter how many resources are invested in large model technology and how powerful its performance is, it must ultimately generate value in specific scenarios.
In the final analysis, for companies in the “mid-sized company” ecological niche, the better choice is to use large model capabilities to reinforce their core business and achieve scenario implementation on top of large models.
In fact, when customers with real needs start using large models, they often find that the “last mile” of AI deployment is the hardest to cross. On the one hand, general-purpose large models are good at generalizing but struggle to understand the specific know-how of businesses, often leading to “nonsense.” On the other hand, the “industry large models” deployed by giants, although claimed to be closer to customer enterprises and more efficient, still cannot perfectly meet the needs of customers based on their original business scenarios.
The gap between “technology” and “application” is a hurdle that all players find difficult to quickly and truly overcome.
Facing this issue, Luo Xiaohui admitted, “To bridge this gap, we cannot rely on external miracles, nor can we expect third parties from outside the industry to solve this for us. We need to explore how to integrate technology with local business needs ourselves. There is no shortcut.”
Both Liu Yingqi, the founder and CEO of Yeahka, and CTO Luo Xiaohui held key technical roles at Tencent and were staunch advocates of self-developed AI. In 2017, they set up an AI lab within the company to explore AI opportunities for cost reduction and efficiency improvement. By the end of 2022, as ChatGPT set the large model field alight, Yeahka began its own journey, looking for the best way to combine AI with its payment business on the strength of a deep understanding of its own scenarios.
Despite the fierce competition in the AI track, a group of “mid-sized companies” still have the opportunity to truly change the industry and provide customers with a significant increase in productivity through large models. Regarding how to find an ecological niche in the large model track, Luo Xiaohui offered an interesting perspective: “I am the ‘customer’.”
2. Rejecting Copycat Solutions
To B businesses often face a dilemma: as vendors promoting technology to customers, they must spend real time and effort to understand each client’s needs, and even deep familiarity with a scenario inevitably leaves gaps in the details.
When making products, each company has its own characteristics, and there are always non-consensus parts – internal implementation methods and business operation models vary. Among these differences, some can be summarized, while others cannot.
“The part that cannot be summarized is actually the hardest gap to cross,” said Luo Xiaohui. “The best approach is to invest in R&D based on the characteristics of the business and avoid relying on borrowed solutions. Otherwise, you won’t be able to go deep.”
In this way, the client (the B-side) develops its own technical products, using its accumulated data and know-how to integrate the large model more closely with specific scenarios. This lets them address industry users’ pain points in a more targeted and detailed way, and can even close the technological gap, achieving results that other vendors would struggle to match despite years of effort.
In this chain, Yeahka serves as the “client” in the traditional B2B chain—they have their own scenarios, are closer to the grassroots B-end users, and naturally have more first-hand high-quality data and industry insights. Therefore, they can meet B-end needs in a more practical, down-to-earth way.
Yeahka has thus also found areas closer to its original business. In the local lifestyle service scenario, responding to growing demand for image and text generation, short video editing, transitions, and special-effects generation, Yeahka has provided a series of technical products and applied first-hand industry data to significantly improve the results of text generation and video cloud editing. Content production costs have fallen by 90%, and content generation efficiency has risen by 70%.
Another typical AI scenario is intelligent customer service. With the continuous rise in labor costs and the bottleneck faced by traditional NLP technology, many AI companies have invested a lot of resources in R&D to improve customer service efficiency, but the results vary.
Many companies provide solutions for intelligent customer service by using large model technology to understand user intent, generate results flexibly, and then use knowledge graphs and retrieval-augmented generation (RAG) techniques to define the scope of output content and reduce hallucinations. However, to effectively utilize knowledge graphs and RAG, having access to a high-quality database becomes the key to success in intelligent customer service. In this regard, mid-sized companies have their own unique advantages.
By cleaning, organizing, and invoking the text and voice data of customer conversations and problem-solving records from the past 10 years, Yeahka’s intelligent customer service system developed based on the “large model + RAG” framework has achieved a higher level of user experience. Six months after its launch, the user self-service rate of Yeahka’s customer service system has increased by more than 30%, reaching nearly 90%.
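For readers unfamiliar with the pattern, the “large model + RAG” flow described above can be sketched in a few lines: retrieve the most relevant knowledge-base entries for a user query, then constrain the model’s answer to that retrieved context. The knowledge base, overlap-based retriever, and prompt below are simplified stand-ins rather than Yeahka’s actual implementation; a production system would use embeddings, a vector index, and a privately deployed model endpoint.

```python
# Minimal sketch of a "large model + RAG" customer-service pipeline.
# All data and functions here are illustrative placeholders.

def tokenize(text):
    # Lowercase, strip trailing punctuation, drop very short words.
    return {w.strip(".,?!").lower() for w in text.split() if len(w) > 2}

def score(query, doc):
    # Word-overlap score; a real system would use embeddings instead.
    q, d = tokenize(query), tokenize(doc)
    return len(q & d) / (len(q) or 1)

def retrieve(query, knowledge_base, k=2):
    ranked = sorted(knowledge_base, key=lambda doc: score(query, doc), reverse=True)
    return ranked[:k]

def build_prompt(query, context_docs):
    context = "\n".join(f"- {doc}" for doc in context_docs)
    # Restricting the model to the retrieved context is what narrows
    # the output scope and reduces hallucination.
    return (
        "Answer using ONLY the knowledge below. If the answer is not "
        f"covered, say you don't know.\n\nKnowledge:\n{context}\n\n"
        f"Question: {query}"
    )

knowledge_base = [
    "Refunds for failed payments are returned within 3 business days.",
    "POS terminals can be rebooted by holding the power key for 5 seconds.",
    "Merchant onboarding requires a business license and a bank account.",
]

query = "How long does a refund take after a failed payment?"
docs = retrieve(query, knowledge_base)
prompt = build_prompt(query, docs)
# `prompt` is then sent to the deployed large model for generation.
```

The key design point is that cleaned historical service data (the knowledge base) does the heavy lifting: the stronger the retrieved context, the less the model is left to improvise.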
Rather than rushing to package and sell pure technical output, Yeahka has focused on integrating technology with its business scenarios, serving its existing ecosystem partners and innovating incrementally on its solid foundation. Data localized to specific scenarios is one of Yeahka’s unique advantages; aggressively promoting large model products could compromise their effectiveness.
3. The Need for Localization Deployment
In the past AI 1.0 era, localization deployment was often seen as a painful issue—project-based, highly customized, labor-intensive AI… these factors often made people associate AI companies with “losing money and making noise.”
In today’s AI 2.0 era, discussions focus on “standardized products,” “out-of-the-box solutions,” and the generalization capabilities of large models, seemingly replacing the commercial models of past AI products.
However, the problem is that although everyone is talking about “standard products”, they haven’t really been developed yet. Besides the aforementioned issues such as data scarcity and industry know-how, it is also because the demand for local deployment of B-end enterprises still exists.
Taking the code generation Copilot scenario as an example, Luo Xiaohui observed that even if AI vendors emphasize “data security” to the extreme, many B-end users may still be reluctant to hand over their “lifeblood” source code.
The so-called “standard product” output in the industry seems close but is actually a long way off. To reassure customers, localization deployment is still necessary.
At the end of last year, with the growing open-source ecosystem for large models, Yeahka saw the possibility of private deployment for code generation models.
As a result, Yeahka first ran small-scale internal trials and then, with private deployment guaranteeing data security, trained the model locally on the company’s own codebase. This gave the code generation tool a proprietary edge and led to Yeahka’s own “Yeahka AI Programming Copilot” product.
Using the “Yeahka AI Programming Copilot” for code generation, commenting, explanation, bug fixing, intelligent Q&A, and unit testing, the average code adoption rate has surpassed 20%, with some programmers’ code adoption rates reaching over 30%.
With limited resources, large-scale technology research is too costly for “mid-sized companies”; every step they take must bring visible results. Local deployment may already be a must for “mid-sized companies” to implement large model business in scenarios.
What many people fail to see is that localization deployment is a highly potential niche market for mid-sized companies:
The giants, relying on cloud platforms, ultimately aim to attract customers to their PaaS and IaaS products, so localization deployment doesn’t align with their logic.
Startup unicorns are generally smaller in scale and may not have extensive To B experience. They need to build project teams from scratch, whereas mid-sized companies that have been working in the To B field for many years naturally appear more mature in comparison.
After recognizing their own ecological niche, mid-sized companies need to roll up their sleeves and focus on localization deployment. While many vendors are still hesitating about whether they should pursue localization in the era of large models, Yeahka identified this opportunity early on and took action: “Only through localization can we maximize the value of AI large models. This is the fundamental principle we adhere to,” summarized Luo Xiaohui.
Luo Xiaohui’s thoughts on the technology development roadmap have already started to take shape. The next step for Yeahka is to leverage generative AI technology to enhance several products and services with high certainty:
One is to serve Yeahka’s overseas expansion needs, such as using large models to break language barriers and building AI Agent applications; for example, Yeahka-invested Fushi Technology is about to launch an AI Agent tailored for the restaurant industry in Southeast Asia. The second is to generate high-quality business materials that empower in-store e-commerce live streaming with AI. The third is to continue improving internal organizational efficiency.
Conclusion:
“The winners in the large model field are more likely to be the companies that already have clearly defined scenarios.” This is an unspoken truth within the industry. Mid-sized companies have scenarios, data, customers, and ecosystems, and they are exactly the ones who have already found the nail and are now looking for the right hammer.
“Pragmatism” is the key word for mid-sized companies. Developing applications based on existing open-source models not only keeps costs low but also allows for localization deployment to ensure the security of internal business operations and user data. In this way, they can serve existing scenarios, solve practical problems within a reasonable cost range, and stay close to cutting-edge technology without reinventing the wheel.
It’s not just Yeahka; other mid-sized companies in vertical industries are also exploring ways to better empower business scenarios with AI. Companies like Meitu, Kingsoft Office Software, and others have been cultivating their respective vertical markets and have already achieved good results.
However, making substantial investments in innovative technologies based on mature businesses, which may not yield immediate returns, is by no means an easy task. Luo Xiaohui stated that the input-output ratio of AI investments needs continuous adjustment and fine-tuning, and it also requires a strong conviction to sustain the process.
Many have asked Luo Xiaohui: With so many external suppliers available, why should you develop AI applications in-house?
Yeahka has, of course, asked itself the same question, but their perspective is much more long-term.
“This might be related to our company’s DNA,” said Luo Xiaohui. “For Yeahka, technology is the foundation of our existence. The wave of AI will profoundly change our industry. Although the exact nature of this change is still unclear, as long as we can master the technology, we will be able to weather the impact of this wave or even become its beneficiaries.”
Media Contact
Company Name: Shenzhen Yeahka Technology Co., Ltd
Contact Person: Isabel LIU
Country: China
Website: http://www.yeahka.com