Technology · March 10, 2026 · 6 min read

From Assistants to Operators: Enterprise AI Agents in Action

By Emily Friedman


AI agents have seemingly replaced generative AI as the next hot thing in enterprise artificial intelligence. Salesforce, ServiceNow, Microsoft, and many other enterprise software vendors now market their own AI agents as helping businesses automate routine tasks and streamline workflows. In theory, this should free up employees for more strategic, revenue-generating, and customer-facing work; in practice, companies are also looking to AI agents to alleviate the skills gap and do more with less human capital.

What is an AI agent? AI agents are software systems or applications that use AI to complete tasks on behalf of users. Often powered by LLMs, AI agents go beyond simple scripts or responses to plan, reason, and use tools like APIs and databases to achieve goals with minimal human input and oversight. AI agents range in complexity, degree of autonomy, and ability to learn, with the most advanced agentic AI systems capable of managing entire multistep workflows.

Most AI agents are examples of composite AI or multi-agent systems, combining various AI techniques and data technologies to break down complex problems into manageable sub-tasks, with each step handled by specialized components or sub-agents (e.g. an LLM for reasoning, a search tool for information retrieval, etc.). Today’s more advanced AI deployments consist of multiple autonomous agents working together. Below are three real-world examples of AI agents in industry.

An Internal AI Agent for the UBER Finance Team

Finch is an internal AI agent used by Uber’s finance and accounting teams. Integrated directly into Slack, Finch streamlines financial data retrieval and removes the need for manual SQL queries and data requests. (SQL is a specialized programming language for managing and retrieving data from relational databases.) Finch delivers real-time, secure and accurate financial insights, accelerating analysis and allowing finance teams to focus on strategy.

Prior to Finch, finance team members had to manually search for the right data by logging into multiple platforms and writing complex SQL queries. Bottlenecks included not having the right permissions (in which case one had to submit a request to the data science team) and having to cross reference documentation and troubleshoot syntax errors.  
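To make the before-and-after concrete, here is a tiny, self-contained example of the kind of manual SQL lookup an analyst would previously have written by hand. The table, columns, and figures are invented for illustration, not Uber's actual schema.

```python
# Simulate a manual financial-data query against a relational database.
# sqlite3 stands in for a production warehouse like Presto.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bookings (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO bookings VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 50.0)],
)

# The kind of query an analyst would previously have written by hand:
rows = conn.execute(
    "SELECT region, SUM(amount) FROM bookings GROUP BY region ORDER BY region"
).fetchall()
# rows == [("APAC", 50.0), ("EMEA", 200.0)]
```

With Finch, a question typed into Slack replaces this hand-written query (and the permission requests and syntax troubleshooting around it).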

Finch has a “multi-agent architecture”: when an accountant or analyst asks a question in Slack, an initial “Supervisor Agent” acts as orchestrator, routing the query to the appropriate sub-agent. Specialized sub-agents include a “SQL Writer Agent,” a web search agent, a data visualization agent, and an MDX writer agent. (MDX is a query language for multidimensional databases.)
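The supervisor-and-sub-agents pattern can be sketched in a few lines. The agent names echo those above, but the routing rule here is a simple keyword heuristic standing in for the LLM-based intent classification a production orchestrator would use; none of this is Uber's actual code.

```python
# Minimal sketch of supervisor-style routing in a multi-agent system.
def sql_writer_agent(query: str) -> str:
    return f"[SQL Writer] would generate a query for: {query}"

def web_search_agent(query: str) -> str:
    return f"[Web Search] would look up: {query}"

def visualization_agent(query: str) -> str:
    return f"[Visualizer] would chart: {query}"

SUB_AGENTS = {
    "sql": sql_writer_agent,
    "search": web_search_agent,
    "chart": visualization_agent,
}

def supervisor(query: str) -> str:
    """Route a user question to the most suitable sub-agent.

    A production supervisor would typically ask an LLM to classify
    intent; a keyword heuristic stands in for that call here.
    """
    lowered = query.lower()
    if any(w in lowered for w in ("plot", "chart", "graph")):
        return SUB_AGENTS["chart"](query)
    if any(w in lowered for w in ("revenue", "bookings", "ledger")):
        return SUB_AGENTS["sql"](query)
    return SUB_AGENTS["search"](query)

print(supervisor("Chart Q3 gross bookings by region"))
```

The point of the pattern is that each sub-agent stays small and specialized, while the supervisor owns the decision of which one to invoke.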

Finch combines several technologies, including a conversational AI interface connected to Slack via API, retrieval-augmented generation or RAG (an AI technique that boosts LLMs by connecting them to external knowledge bases), and self-querying agents. It’s integrated with Uber’s internal data platforms like Presto, IBM Planning Analytics, and Oracle EPM. 
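Retrieval-augmented generation boils down to two steps: retrieve the most relevant documents, then hand them to the LLM as context alongside the question. The toy knowledge base and word-overlap scoring below are stand-ins for illustration, not Uber's pipeline; real RAG systems retrieve by vector similarity.

```python
# Toy RAG illustration: retrieve relevant documents, then build the
# augmented prompt a language model would receive.
KNOWLEDGE_BASE = [
    "Gross bookings are reported in the quarterly finance dashboard.",
    "Trip-level revenue data lives in the rides fact table.",
    "Driver incentive spend is tracked by the payments team.",
]

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(question: str) -> str:
    """Assemble retrieved context plus the question for the LLM."""
    context = "\n".join(retrieve(question))
    return f"Context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("Where is trip revenue data stored?")
```

Grounding the model in retrieved, company-specific documents is what lets a general-purpose LLM answer questions about internal data it was never trained on.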

Finch boasts natural and Uber-specific language understanding (it can map internal finance terminology to structured data sources), metadata-driven query generation (it can fetch relevant metadata and construct and execute precise SQL queries), and secure data access (including built-in role-based permissions to protect sensitive financial data). Results – SQL outputs – are formatted and delivered back to Slack with comments and automatic export to Google Sheets. 

A tool like Finch requires rigorous performance testing and continuous optimization to maintain accuracy. Uber employs end-to-end validation using simulated queries to ensure reliability and regression testing to detect drifts. (Drift refers to the degradation of an AI model’s performance over time as real-world data diverges from the original training data.)
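The regression-testing idea described above can be sketched as replaying a fixed set of simulated queries with known-good answers and flagging when accuracy drops below a threshold. The agent stub and test cases here are invented; the drifted answer is deliberate, to show a failing check.

```python
# Sketch of regression testing for an AI agent: replay golden queries
# and flag accuracy drops that may indicate drift.
GOLDEN_CASES = [
    ("total bookings for EMEA", "200.0"),
    ("total bookings for APAC", "50.0"),
]

def fake_agent(question: str) -> str:
    """Stand-in for the real agent under test (one answer has drifted)."""
    answers = {
        "total bookings for EMEA": "200.0",
        "total bookings for APAC": "55.0",  # drifted from the golden "50.0"
    }
    return answers[question]

def regression_check(agent, cases, min_accuracy=0.9):
    """Return (accuracy, passed) over the golden test cases."""
    correct = sum(agent(q) == expected for q, expected in cases)
    accuracy = correct / len(cases)
    return accuracy, accuracy >= min_accuracy

accuracy, passed = regression_check(fake_agent, GOLDEN_CASES)
# accuracy == 0.5, passed == False -> investigate for drift
```

Running such a suite continuously is what turns "the agent seems fine" into a measurable reliability signal.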

Agentic AI Tools for EBAY Sellers and Shoppers

eBay offers a number of AI tools to speed up and simplify the seller process. The e-commerce giant has been rolling out AI-powered seller features since 2023 in a bid to attract and retain sellers and boost inventory as it competes with the likes of Poshmark, Etsy, and Amazon.

In 2024, for instance, eBay introduced an AI-powered bulk listing tool that can create multiple listings from photos. Another tool announced in 2025 helps sellers quickly reply to buyer questions, drawing responses from live listings. Sellers are able to review and refine each message before sending. 

eBay is aware that many of its sellers are resource-strapped individuals (as opposed to brands) who may be skeptical of AI. Sellers don’t have to use AI, but the tools are available and regularly tested and refined based on seller suggestions, customer service calls, and even Reddit forums.

Though not exactly AI agents, eBay’s various AI seller tools are examples of generative and conversational AI and part of the company’s push into agentic AI. Last year, eBay shifted from experimental chatbots to more autonomous AI assistants with its “AI shopping agent” for buyers.

Combining agentic AI with eBay’s “scale, infrastructure, decades of customer insights, and product knowledge,” the AI shopping agent personalizes product discovery through conversational prompts. “eBay.ai” finds items based on user requests and personal shopping preferences, appearing when prompted or through predictive messaging on the current webpage, and was initially rolled out to a small percentage of U.S. customers. 

eBay envisions a suite of agentic AI tools that sell items, write code, and even create marketing campaigns. To that end, the online marketplace has created its own “agent framework” capable of using several LLMs in the background. The framework acts as an orchestrator, deciding which AI models to use for specific tasks. eBay also has an internal “Responsible AI” team. 

AI Employee Concierge & Building AI Infrastructure at DEUTSCHE TELEKOM

Europe’s largest telecom has been building its AI infrastructure since 2023.

This video from last year highlights “askT,” an AI agent or “employee concierge” built to assist Deutsche Telekom (DT) employees with questions about internal policies and benefits as well as product and service inquiries. 

As you can imagine, there are thousands of fragmented knowledge bases across Deutsche Telekom. DT built or trained its own LLM* on multiple knowledge bases (and continues to add more), resulting in a powerful, time-saving, and company-specific information retrieval tool that enhances workflows and empowers employees to provide better customer service.

(*A large language model or LLM is an advanced deep learning model trained on vast and diverse datasets to understand, summarize, generate, and predict content.)

AskT combines conversational AI, retrieval-augmented generation (RAG), and other technologies to help DT employees across Germany handle routine tasks, from submitting vacation requests to HR to comparing rate plans on customer support calls.

To support its long-term AI efforts, which include deploying AI agents across geographies, Deutsche Telekom built the Language Model Operating System (LMOS), a “multi-agent PaaS” for building and scaling AI agents. Rather than force AI into fragmented workflows, DT engineers are able to build AI agents using the “APIs and libraries they already know” and easily collaborate with business teams to customize them.

At the core of LMOS is Arc, a Kotlin*-based framework for defining agent behavior using a concise domain-specific language (DSL). Agent Definition Language (ADL) allows the business side to “chime in” to define agent logic and workflows. (*Kotlin is a programming language.)
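Arc's DSL is Kotlin-based; as a rough analogue in this article's running Python sketches, a decorator-based registry can play the same role of letting teams declare agent behavior concisely in one place. This is entirely illustrative and is not Arc's or ADL's actual syntax.

```python
# A minimal agent-definition registry: declare agents declaratively,
# look them up by name at runtime.
AGENTS = {}

def agent(name, *, description):
    """Register a function as a named agent with a description."""
    def decorator(fn):
        AGENTS[name] = {"description": description, "handler": fn}
        return fn
    return decorator

@agent("billing", description="Answers questions about rate plans")
def billing_agent(message: str) -> str:
    return f"billing agent handling: {message}"

reply = AGENTS["billing"]["handler"]("Compare the M and L plans")
```

The appeal of the DSL approach is that the agent's name, description, and behavior live together in one readable declaration that non-engineers can review and help shape.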

Since agent responses depend on domain knowledge buried in documentation and internal data sources, RAG needed to come into play, too. DT selected Qdrant, an open-source vector database and search engine allowing segmentation by country, domain, and agent type. Wurzel, another open-source framework, supports information retrieval at scale, making RAG pipelines “reusable, consistent, and easy to maintain across diverse teams and markets.”

(In machine learning, vectors or embeddings are numerical representations of complex data like text, images and audio. Vector databases store information as high-dimensional vectors, clustering semantically similar data points together based on proximity. This allows for fast and efficient searches, which is essential for RAG systems.)
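The proximity search described above can be sketched with cosine similarity: embed items as vectors, then return the nearest neighbors to a query vector. Real systems like Qdrant use learned, high-dimensional embeddings and approximate indexes; the 3-D vectors and document names here are toys.

```python
# Minimal vector search: rank documents by cosine similarity to a query.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

DOCS = {
    "vacation policy": [0.9, 0.1, 0.0],
    "rate plan comparison": [0.1, 0.9, 0.2],
    "expense reporting": [0.8, 0.2, 0.1],
}

def search(query_vec, top_k=2):
    """Return the top_k documents closest to the query vector."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:top_k]

# A query vector near the HR-related documents:
results = search([1.0, 0.0, 0.1])
```

Because semantically similar items sit close together in vector space, the same mechanism retrieves relevant context for a RAG pipeline even when the query shares no exact keywords with the documents.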

Today, the LMOS tech stack “serves as the backbone” for Deutsche Telekom’s AI services, powering askT and customer-facing AI agents like Ask Magenta.

What Can We Learn? AI Agent Lessons

As we’ve seen, AI agents can operate behind the scenes or at customer touchpoints. Forward-thinking companies are examining workflows, creating frameworks, and even tying AI agents to specific KPIs.

Internal AI agents at Uber and Deutsche Telekom have resulted in reduced search times, adding up to hours of work saved, while eBay attributes more than 100 million new listings to AI as of last year. Each of the above use cases reveals a best practice: the importance of oversight and validation (Uber’s Finch); responsible experimentation, including end-user feedback (eBay); and proceeding with caution by identifying key factors (security, ease of use, fast iteration, multi-tenancy support, cross-organization scaling, etc.) and building an appropriate framework for the future (Deutsche Telekom).

Image source: eBay
