Technical Deep Dive
The debate between Token and DAA is rooted in fundamentally different architectural and operational philosophies.
Token-Centric View (The NVIDIA Paradigm): This view treats the LLM as a black box that converts input tokens into output tokens. The key performance indicators (KPIs) are tokens per second (TPS), time to first token (TTFT), and cost per million tokens. The engineering focus is on maximizing hardware utilization (FLOPS), reducing latency through techniques like speculative decoding and KV-cache optimization, and scaling up model size and context windows. The value proposition is simple: faster, cheaper, and larger-scale text generation. These are supply-side metrics that reward raw efficiency.
DAA-Centric View (The Baidu Paradigm): This view treats the LLM as the 'brain' of an agent. The agent is a software program that uses the LLM to perceive its environment, reason, plan, and execute actions to achieve a user's goal. The KPIs shift dramatically. They include:
* Task Success Rate: Did the agent accomplish what the user asked?
* User Retention: Do users come back to the agent daily?
* Average Session Length: How much time do users spend interacting with the agent?
* Tool Call Accuracy: How reliably does the agent call external APIs (e.g., booking a flight, querying a database)?
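To make these KPIs concrete, here is a minimal sketch of how they might be computed from raw interaction logs. The event schema and field names are illustrative assumptions, not any vendor's actual telemetry format:

```python
from collections import defaultdict

def agent_kpis(events):
    """Compute illustrative agent KPIs from a list of event dicts.

    Assumed schema per event (hypothetical):
      {"user": str, "day": int, "task_done": bool,
       "tool_calls": int, "tool_errors": int}
    """
    tasks = len(events)
    successes = sum(e["task_done"] for e in events)
    tool_calls = sum(e["tool_calls"] for e in events)
    tool_errors = sum(e["tool_errors"] for e in events)

    # Daily Active Agents: distinct users interacting with the agent each day.
    users_by_day = defaultdict(set)
    for e in events:
        users_by_day[e["day"]].add(e["user"])

    return {
        "task_success_rate": successes / tasks,
        "tool_call_accuracy": 1 - tool_errors / tool_calls if tool_calls else None,
        "avg_daa": sum(len(u) for u in users_by_day.values()) / len(users_by_day),
    }

events = [
    {"user": "a", "day": 1, "task_done": True,  "tool_calls": 4, "tool_errors": 1},
    {"user": "b", "day": 1, "task_done": False, "tool_calls": 2, "tool_errors": 2},
    {"user": "a", "day": 2, "task_done": True,  "tool_calls": 3, "tool_errors": 0},
]
print(agent_kpis(events))
```

Note that none of these numbers can be derived from token counts alone, which is the crux of the metric debate.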
The engineering focus here is not on raw token generation but on agentic frameworks. This involves:
* Planning & Reasoning: Implementing architectures like ReAct (Reasoning + Acting) or Tree-of-Thoughts to allow the agent to break down complex tasks.
* Memory Management: Building long-term and short-term memory stores (often using vector databases like Chroma or Milvus) so the agent can remember user preferences and past interactions.
* Tool Integration: Creating robust, error-handling interfaces for the agent to call external APIs and databases.
* Safety & Alignment: Ensuring the agent doesn't take harmful or unintended actions in the real world.
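The planning and tool-integration concerns above can be sketched as a minimal ReAct-style loop. This is a toy illustration, not a production framework: the `plan` function stands in for the LLM call, and the tools and planner logic are hypothetical:

```python
def react_agent(goal, tools, plan, max_steps=5):
    """Minimal ReAct-style loop: the planner proposes an action, the agent
    executes the matching tool, and the observation feeds the next step.

    `plan(goal, history)` stands in for an LLM call and returns either
    ("finish", answer) or (tool_name, args).
    """
    history = []
    for _ in range(max_steps):
        action, payload = plan(goal, history)
        if action == "finish":
            return payload
        try:
            observation = tools[action](payload)        # act
        except Exception as exc:                        # robust tool error handling
            observation = f"tool error: {exc!r}"
        history.append((action, payload, observation))  # reason over past steps
    return None  # step budget exhausted: the agent could not finish

# Toy tool registry and a hard-coded planner standing in for the LLM.
tools = {"add": lambda args: args[0] + args[1]}

def plan(goal, history):
    if not history:
        return ("add", (2, 3))
    return ("finish", history[-1][2])  # answer with the last observation

print(react_agent("add 2 and 3", tools, plan))  # prints 5
```

Real frameworks like LangChain or CrewAI layer memory, retries, and prompt templating on top of essentially this loop.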
A relevant open-source project that embodies the DAA-centric view is AutoGPT (GitHub: Significant-Gravitas/AutoGPT, ~165k stars). It was one of the first projects to demonstrate an autonomous agent that could chain LLM calls to achieve a goal, like researching a topic and writing a report. More recently, CrewAI (GitHub: joaomdmoura/crewAI, ~25k stars) has gained traction for orchestrating multiple specialized agents to work together on complex workflows. These projects show that the real innovation is happening at the orchestration layer, not just the model layer.
Data Table: Token vs. DAA Performance Metrics
| Metric | Token-Centric (e.g., GPT-4o) | DAA-Centric (e.g., Baidu ERNIE Agent) |
|---|---|---|
| Primary KPI | Tokens per second | Daily Active Agents (DAA) |
| Latency Focus | Time to First Token (TTFT) | End-to-end Task Completion Time |
| Cost Driver | Compute (GPU hours) | API calls + Tool execution + Memory storage |
| Optimization Goal | Maximize throughput | Maximize task success rate & user retention |
| Failure Mode | Hallucination, incoherent text | Agent gets stuck in loop, fails to call tool correctly |
| Benchmark | MMLU, HumanEval, MT-Bench | AgentBench, WebArena, custom task-specific tests |
Data Takeaway: The table illustrates a fundamental shift in engineering priorities. A model that excels at token throughput (high TPS, low cost) can still fail as an agent if it cannot plan, use tools, or remember context. The DAA-centric view demands a new set of benchmarks and optimization targets that are closer to real-world utility.
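A task-level benchmark in the spirit of AgentBench or WebArena scores outcomes rather than text quality. The harness below is a deliberately minimal sketch (the agent and task suite are toy stand-ins) of what "maximize task success rate" looks like as an evaluation loop:

```python
def run_benchmark(agent, tasks):
    """Score an agent by task success rate rather than token throughput.

    Each task is (prompt, check), where check(output) -> bool decides
    whether the agent's output actually accomplished the task.
    """
    results = [check(agent(prompt)) for prompt, check in tasks]
    return sum(results) / len(results)

# Toy "agent" and hypothetical task suite for illustration only.
def echo_agent(prompt):
    return prompt.upper()

tasks = [
    ("book flight", lambda out: out == "BOOK FLIGHT"),
    ("pay bill",    lambda out: "PAY" in out),
    ("fail me",     lambda out: out == "something else"),
]
print(run_benchmark(echo_agent, tasks))  # 2 of 3 tasks succeed
```

The key design point is that each task carries its own success checker: a model with perfect perplexity still scores zero if the checkers fail.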
Key Players & Case Studies
The Token vs. DAA debate is personified by two of the industry's most influential figures.
Jensen Huang (NVIDIA) - The Token Evangelist: Huang's entire business model is predicated on the token-centric view. NVIDIA sells the shovels in the AI gold rush. More tokens mean more GPU demand. His keynote speeches are filled with charts showing exponential growth in token generation. This view is shared by major cloud providers (AWS, Azure, GCP) who sell compute by the hour. Their incentive is to keep the 'token firehose' flowing.
Robin Li (Baidu) - The DAA Advocate: Li is taking a contrarian stance. Baidu has its own LLM, ERNIE 4.0, but it is not competing on raw model size against GPT-4 or Claude. Instead, Baidu is aggressively building an agent ecosystem. Its strategy involves:
1. Embedding Agents into Existing Products: Baidu Search, Maps, and Cloud are being rebuilt around agentic interfaces. A user can ask the Baidu Maps agent to 'plan a weekend trip to Hangzhou with restaurants and hotels under $500,' and the agent will execute a multi-step workflow.
2. Launching an Agent Platform: Baidu has released a platform for third-party developers to build and deploy their own agents on Baidu's ecosystem, similar to a 'Shopify for AI agents.' The success of this platform is measured by DAA.
3. Focusing on Vertical Agents: Baidu is targeting specific industries like healthcare (AI medical assistant), finance (AI investment advisor), and education (AI tutor), where task completion is more valuable than generic text generation.
Case Study: The Failure of 'Chatbot' as a Product: The industry's first wave of consumer AI was the general-purpose chatbot (ChatGPT, Bard, etc.). While they achieved massive user numbers, their DAU (daily active users) as a percentage of total users has shown signs of plateauing. Users chat for a while, get bored, and leave. The agent model aims to fix this by making the AI indispensable for specific, recurring tasks. For example, a 'personal finance agent' that automatically tracks spending, pays bills, and optimizes investments has a much higher potential for daily stickiness than a chatbot that writes poems.
Data Table: Comparing AI Product Strategies
| Company | Core AI Product | Primary Metric | Business Model | Key Risk |
|---|---|---|---|---|
| OpenAI | ChatGPT (Chatbot) + API | Tokens/API calls | Subscription + Usage-based | Commoditization of chat; low user retention |
| NVIDIA | GPUs (H100, B200) | FLOPS, Token throughput | Hardware sales | Demand shift to inference; custom AI chips |
| Baidu | ERNIE Agent Platform | DAA | Platform fees, ad revenue, cloud services | Agent reliability; user trust; competition from other platforms |
| Microsoft | Copilot (Embedded Agent) | Daily Active Users (DAU) | Subscription (M365) | Integration complexity; user training |
Data Takeaway: The table shows that Baidu's bet on DAA is a high-risk, high-reward platform play. Unlike OpenAI, which sells a commodity (tokens), or NVIDIA, which sells a necessity (compute), Baidu is trying to build a new marketplace for agent services. Its success depends on creating a network effect where more agents attract more users, which in turn attracts more developers.
Industry Impact & Market Dynamics
If Baidu's DAA-centric strategy proves successful, it will reshape the AI industry in several ways.
1. The Devaluation of Raw Model Performance: If the market shifts to DAA, the 'model arms race' (who has the best MMLU score) becomes less important. A slightly less capable model that is cheaper, faster, and easier to integrate into an agent framework could win. This would be a blow to companies that have invested billions in training frontier models.
2. The Rise of the 'Agent Middleware' Layer: A new category of software will emerge that sits between the LLM and the end-user. This includes agent orchestration frameworks (LangChain, CrewAI), memory databases (Chroma, Pinecone), and tool integration platforms (Zapier for AI). This layer will capture significant value.
3. New Business Models: Instead of paying per token, users may pay per task completed (e.g., $0.10 per successful flight booking) or a flat monthly subscription for a suite of personal agents. This aligns costs with user-perceived value.
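A back-of-envelope comparison makes the pricing shift concrete. All prices here are illustrative assumptions (a hypothetical $5 per million tokens and $0.10 per completed task), not any vendor's actual rates:

```python
def token_cost(tokens, price_per_million=5.0):
    """Usage-based pricing: the customer pays for tokens generated
    (price_per_million is an assumed, illustrative rate)."""
    return tokens / 1_000_000 * price_per_million

def task_cost(tasks_completed, price_per_task=0.10):
    """Outcome-based pricing: the customer pays per successful task
    (price_per_task is an assumed, illustrative rate)."""
    return tasks_completed * price_per_task

# An agent workflow might burn 50k tokens across planning, retries, and
# tool calls to complete a single booking; the user values the outcome.
print(token_cost(50_000))  # what the compute actually cost to deliver
print(task_cost(1))        # what the user is billed under per-task pricing
```

Under these assumed numbers the provider absorbs a $0.25 compute cost against a $0.10 price, which is why per-task pricing creates relentless pressure to optimize inference, echoing the table's projection of falling cost per agent task.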
4. Impact on Hardware: A DAA-centric world could shift demand from training GPUs to inference GPUs. It also increases the importance of low-latency, low-power inference chips, potentially benefiting companies like Groq or Apple's Neural Engine over NVIDIA's data-center behemoths.
Data Table: Market Projections for AI Agent Economy
| Metric | 2024 (Estimate) | 2027 (Projection) | Source of Trend |
|---|---|---|---|
| Global AI Agent Market Size | $5 Billion | $30 Billion | Industry analyst consensus |
| % of AI Spend on Agents | 10% | 40% | Shift from model training to application |
| Avg. DAA per User (Consumer) | 0.1 (e.g., using ChatGPT roughly once every 10 days) | 3 (e.g., using a finance, travel, and health agent daily) | Based on mobile app adoption curves |
| Cost per Agent Task | $0.05 (highly variable) | $0.01 (due to optimization and competition) | Moore's Law for inference |
Data Takeaway: The projected growth of the AI agent market from $5B to $30B in three years is a strong signal that the industry is betting on the DAA-centric model. The key inflection point will be when the average user has more than one daily agent interaction—that's when the platform wars will truly begin.
Risks, Limitations & Open Questions
The DAA-centric view is not without its flaws and risks.
1. The 'Hype Cycle' Risk: 'AI Agents' are currently in a state of extreme hype. Many current agents are unreliable, slow, and prone to catastrophic errors. If early user experiences are poor, the DAA metric could plummet, discrediting the entire approach.
2. The 'Cold Start' Problem: A platform is only valuable if it has both users and agents. Baidu must solve this chicken-and-egg problem. Without a killer agent, users won't come. Without users, developers won't build agents.
3. The 'Black Box' Agent Problem: When an agent fails, it's hard to debug. Was the LLM's reasoning flawed? Did it call the wrong API? Did the external service change? This makes building reliable agents extremely difficult.
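One common mitigation is structured tracing: wrap every tool so each call records its inputs, outputs, and failures, which lets a post-mortem distinguish "the LLM chose the wrong tool" from "the tool itself broke." The sketch below shows the idea with a hypothetical wrapper; real frameworks expose richer tracing APIs:

```python
import time

def traced(tool_name, tool, trace):
    """Wrap a tool so every invocation appends a structured record to
    `trace`, making failed agent runs attributable after the fact."""
    def wrapper(*args):
        entry = {"tool": tool_name, "args": list(args), "ts": time.time()}
        try:
            entry["result"] = tool(*args)
            entry["ok"] = True
        except Exception as exc:
            entry["error"] = repr(exc)
            entry["ok"] = False
        trace.append(entry)
        if not entry["ok"]:
            raise RuntimeError(f"{tool_name} failed; see trace for details")
        return entry["result"]
    return wrapper

trace = []
safe_div = traced("divide", lambda a, b: a / b, trace)
print(safe_div(10, 2))  # succeeds; the call is recorded in trace
try:
    safe_div(1, 0)      # fails; the trace pins the failure on the tool
except RuntimeError:
    pass
print(trace[-1]["error"])
```

With a trace like this, a debugging session can at least separate tool-layer failures from reasoning-layer ones, though flawed LLM plans remain the harder half of the problem.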
4. Ethical & Safety Concerns: Autonomous agents that can book travel, spend money, or post on social media introduce significant risks. A poorly designed agent could cause real-world harm. The industry lacks robust safety standards for agentic AI.
5. The 'Jensen' Counter-Argument: NVIDIA's view is that the demand for tokens is insatiable. Even if the world moves to agents, those agents will be powered by even larger, more capable models that generate even more tokens. In this view, DAA is just a different way of consuming tokens, not a replacement for them.
AINews Verdict & Predictions
The Token vs. DAA debate is a false dichotomy in the long run, but a crucial strategic choice in the short term. Baidu's pivot to DAA is a brilliant, high-risk bet that correctly identifies the next phase of AI commoditization.
Our Predictions:
1. DAA will become a standard industry metric within 24 months. Just as DAU (Daily Active Users) became the standard for social media, DAA will become the standard for AI applications. Companies that fail to report it will be seen as opaque.
2. The 'Agent Platform' will be the most valuable slice of the AI stack. The company that owns the platform where developers build and users discover agents will capture the majority of the value. Baidu, with its massive existing user base in China, is well-positioned to do this in its home market. The equivalent in the West is still up for grabs.
3. NVIDIA will eventually adapt. While Jensen Huang will continue to sell the 'token firehose' narrative, NVIDIA's product roadmap will increasingly focus on inference efficiency and agent-specific workloads. Expect to see 'Agent-Optimized' GPU SKUs.
4. The 'Killer Agent' will not be a generalist. It will be a vertical specialist (e.g., 'the best AI accountant for freelancers'). The DAA metric will be won in niches, not in broad general-purpose chatbots.
What to Watch Next:
* Baidu's Q3 2025 Earnings: The first time they formally report DAA as a key metric. The market's reaction will be telling.
* OpenAI's Response: Will they launch an agent platform? Or double down on the API token model?
* The Success of Agent Benchmarks: Can the community create a reliable, standardized benchmark for agent performance that correlates with DAA?
The industry is moving from asking 'How much can it compute?' to 'What can it do for me?' Baidu is betting the company on the latter question. It may be the smartest bet in AI right now.