On June 26, DingTalk hosted the Make 2024 DingTalk Ecosystem Conference in Beijing, where it officially launched its AI-powered search feature and enhanced the AI assistant introduced in January. Significantly, DingTalk announced the full opening of its model layer, partnering with six large model companies to build a more open AI ecosystem.
AI search is becoming a hot topic. Perplexity AI, with its unique Q&A engine, recently reached a valuation of over USD 1 billion in its Series B funding round. Google is integrating Gemini into its search engine, while Chinese players like TianGong AI, Meta Sota, and Genspark from MainFunc have entered the market. Unlike these products, DingTalk’s AI search solution primarily targets internal enterprise scenarios.
DingTalk aims to resolve the fragmentation of information, a major pain point for users. Within a company, vast amounts of information are generated daily across various collaboration tools, accumulating as chat records, documents, meeting minutes, and business data. Previously, employees had to sift through past chat logs to find what they needed, resulting in inefficiency. With the launch of its AI search feature, DingTalk can understand user queries through natural language processing (NLP) and integrate fragmented information scattered across documents, chat logs, and meeting notes into a coherent knowledge network, presented as mind maps and outlines, making information more accessible.
Another highlight is the feature’s ability to predict and recommend information that users might need, based on their work scope and historical behavior. Additionally, the AI search feature can delve into internal data, including personal and team schedules, to-do lists, and project details, enabling comprehensive information collection and organization. This means users can not only find the information they need but also trace related personnel, organizations, and documents for a multidimensional overview.
Unlike user-personalized AI assistants that require manual data input, DingTalk’s solution draws on data already accumulated on the platform, promising a differentiated user experience. In a post-conference interview, DingTalk CEO Ye Jun said that, while current AI capabilities are used primarily for general search, deeper application at the enterprise level will incorporate more data into the system, enabling specialized use cases.
DingTalk COO Fu Xujun said that, unlike traditional search engines and other public AI search products, DingTalk’s version focuses on the B-end, aggregating data from business, management, and personal usage scenarios to provide accurate and personalized search results.
Looking ahead, DingTalk hopes to combine AI technology with knowledge graph (KG) technology to form a networked structure of information, enhancing task execution. Discussing challenges in developing AI search, Fu acknowledged that presenting data accurately and objectively for internal enterprise use is a key challenge compared to consumer-facing use cases.
The business intelligence (BI) scenario exemplifies this. Fu explained that traditional BI workflows are long and demanding, involving steps such as accurately fetching data, choosing appropriate visualizations like line or pie charts, analyzing trends, and handling errors along the way. “Our goal is to eliminate hallucinations and leverage large models’ strengths, like understanding, abstraction, summarization, and emergent capabilities, to improve,” said Fu.
In addition to AI search capabilities, DingTalk upgraded the AI assistant released in January. The update connects the AI assistant to the Tongyi Qianwen large model by default and enables users to switch between large models developed by companies such as MiniMax, Moonshot AI, Zhipu AI, OrionStar, 01.AI, and Baichuan AI, based on their needs. The AI assistant also gains enhanced memory and reasoning capabilities, allowing it to remember relevant information and break down complex tasks, while its perception system adds time awareness, helping coordinate team schedules and avoid conflicts.
DingTalk also introduced multi-agent collaboration and anthropomorphic operation capabilities. Multi-agent collaboration allows multiple AI assistants to work together in a workflow, breaking a complex task into sequential steps. For example, in planning a marketing event, three AI assistants can collaboratively complete tasks like identifying hotspots, analyzing events, and brainstorming creative ideas, automating the process. Anthropomorphic operations let users demonstrate complex operations to the AI assistant through dialogue, enhancing convenience and efficiency.
Ye said that DingTalk will share its AI capabilities with ecosystem partners and customers through two APIs, pushing AI applications into more scenarios. Beyond new AI products, DingTalk’s strategy of ecosystem openness continues. Last April, DingTalk integrated the Tongyi Qianwen large model, followed by offering its AI platform-as-a-service (PaaS) solution to ecosystem partners and customers in August, helping them redesign products with large models.
Currently, DingTalk’s ecosystem includes over 5,600 partners, more than 100 of which are AI ecosystem partners covering areas like AI solutions and plugins. DingTalk said its AI services are now called over 10 million times daily. Notably, DingTalk, which is part of Alibaba Group, has now fully opened its model layer to multiple large model ecosystem partners, covering product and scenario access as well as its AI assistant development platform.
As of now, six companies—MiniMax, Moonshot AI, Zhipu AI, OrionStar, 01.AI, and Baichuan AI—have integrated with DingTalk, jointly exploring various real-world scenarios. Developers can choose different large models from these companies for personalized development.
At the conference, DingTalk also launched a startup version, priced at RMB 980 (USD 135) per year, alongside its other commercial versions. DingTalk has also reduced prices this year, bundling customized workbenches, enterprise mailboxes, annual inspection certification, and enterprise cloud disks into the startup version, and announced free one-year usage for 10,000 newly registered enterprises.
Whether expanding coverage internally or collaborating with large model partners externally, the ultimate goal is to increase AI adoption. As DingTalk is showcasing, this has become a common goal among AI players this year.
KrASIA Connection features translated and adapted content that was originally published by 36Kr. This article was written by Wang Yixin for 36Kr.