
The Agentic Future: How DaveAI is reimagining human-AI interaction


Responses by Dr. Ananthakrishnan Gopal, CTO & Co-Founder, DaveAI

The way we interact with technology is undergoing a fundamental shift. From the early days of point-and-click interfaces to voice assistants and now intelligent agents, user experience is no longer confined to screens or static menus. At the center of this shift is DaveAI, a company redefining how people engage with digital platforms through agent-based interactions that feel more human, adaptive, and intuitive.

As industries seek smarter, more scalable ways to deliver personalized service, DaveAI is pushing the boundaries of what’s possible, building agents that not only respond to user input but understand context, solve problems, and even initiate actions. Whether in retail, automotive, or financial services, these AI-driven collaborators are reshaping the way customers discover, decide, and interact.

In this exclusive conversation, we speak with a senior voice at DaveAI, Dr. Ananth G, to understand the company’s vision, challenges, and the real-world applications of its intelligent agent technology. From design frameworks to ethical considerations and the future of apps, here’s how DaveAI is leading the next evolution in AI-powered engagement.

DaveAI has been at the forefront of experiential commerce. How are you redefining AI interactions by focusing on intelligent agents over traditional interfaces?

Dr. Ananth G: In the past, graphical user interfaces and mobile applications were the primary mode of interaction. As AI models have improved, natural-language understanding has become more and more effective. The new interface is “Tell me what you want and I will get it done”. Starting with Siri, Alexa and the like, the newest generation of interfaces is groundbreakingly intuitive. Intelligent agents can interpret a text or voice input and break the request down into tasks. Something that once required the user to be familiar with the interface, its menus and buttons and so on, before getting anything done has changed: users can now simply ask the app for what they need.

In a world full of touchpoints, why do you believe agent-based design is a more sustainable and effective path forward for customer experience?

Dr. Ananth G: I believe that agent-based design is filling a gap that existed in customer experience. An agent can get something done for the customer 24×7, answer their queries and resolve around 70 to 80% of the customer’s needs today. There is, however, no evidence that agent-based customer experience will completely replace any of the existing touchpoints. It is definitely cheaper and more effective for a large swathe of customer experience problems.

Can you share an example where a DaveAI agent went beyond scripted interactions to demonstrate autonomous decision-making or contextual adaptation?

Dr. Ananth G: The best example we have is that of a customer service bot we created for an electric vehicle manufacturer. Our agent interfaces with their diagnostic APIs to resolve problems the user is facing. This includes asking questions, calling the diagnostic APIs, giving the user instructions to follow, and even running over-the-air updates to the EV software.
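The troubleshooting loop described here can be sketched in broad strokes. This is an illustration only: the diagnostic function and status fields below are hypothetical stand-ins, since the real manufacturer integration is vendor-specific and not described in the interview.

```python
# Highly simplified sketch of the EV customer-service agent's loop.
# run_diagnostics() and its report fields are hypothetical stand-ins
# for the manufacturer's proprietary diagnostic APIs.

def run_diagnostics(vehicle_id):
    # A real agent would call the manufacturer's diagnostic API here.
    return {"battery": "ok", "firmware": "outdated"}

def resolve(vehicle_id):
    """Decide on actions based on the diagnostic report."""
    report = run_diagnostics(vehicle_id)
    actions = []
    if report.get("firmware") == "outdated":
        # Push an over-the-air update, as described in the interview.
        actions.append("schedule OTA firmware update")
    if report.get("battery") != "ok":
        actions.append("instruct user on battery reset steps")
    if not actions:
        actions.append("no issue found; escalate to human support")
    return actions
```

The key point is that the agent does not just answer from a script: it queries live vehicle state and chooses its next action from the result.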

What technical or design frameworks guide DaveAI when building agents that feel less like chatbots and more like intelligent collaborators?

Dr. Ananth G: We have a proprietary micro-services architecture called GRYD, which natively supports agentic framework implementations. We are able to call multiple agents, each with its own task specialization. Finally, taking in the inputs from all the individual agents, we are able to make complex decisions and make the interactions more human-like.
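The multi-agent pattern described here, specialist agents whose outputs are combined into one decision, can be sketched roughly as follows. GRYD itself is proprietary, so every class, method, and specialty name below is a hypothetical illustration of the general pattern, not DaveAI's actual API.

```python
# Illustrative sketch of a multi-agent orchestration pattern.
# All names here are hypothetical; GRYD's real interfaces are proprietary.

class Agent:
    """A specialist agent that handles one category of sub-task."""
    def __init__(self, specialty):
        self.specialty = specialty

    def handle(self, task):
        # A real agent would call a model or backend service here.
        return f"{self.specialty} result for: {task}"

class Orchestrator:
    """Routes each sub-task to a specialist, then combines the outputs."""
    def __init__(self, agents):
        self.agents = {a.specialty: a for a in agents}

    def run(self, plan):
        # `plan` maps each sub-task to the specialty that should handle it.
        results = [self.agents[spec].handle(task) for task, spec in plan]
        # Combine the individual outputs into one response/decision.
        return " | ".join(results)

orchestrator = Orchestrator([Agent("pricing"), Agent("inventory")])
reply = orchestrator.run(
    plan=[("quote price", "pricing"), ("check stock", "inventory")],
)
```

The design choice worth noting is the separation of concerns: each agent stays narrow and testable, while the orchestrator owns task decomposition and final aggregation.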

What are the biggest challenges in building AI agents that can learn, adapt, and behave consistently across industries like retail, automotive, or BFSI?

Dr. Ananth G: Among the biggest challenges is customer education. Many enterprises and their management perceive AI agents as perfect and all-knowing. They expect the AI to know the enterprise’s proprietary processes without taking the time and effort to train it with that information. Another aspect is that consistency in AI is still a large problem. We need to tread a thin line between providing human-like responses and still following all the standards and caveats of the enterprise. If there are too many restrictions, the responses and interactions appear canned; on the other hand, errors or hallucinations become likely if sufficient restrictions are not placed on the agent.

With generative AI capabilities growing, how is DaveAI leveraging them to empower its agents with better reasoning, creativity, or personalization?

Dr. Ananth G: We have experimented with both open-source and API-based models. We are also in the process of developing domain-specific models to provide reasoning and personalization for specific industries.

Can you talk about any new projects or prototypes where DaveAI is experimenting with embodied AI or multimodal agents—moving beyond voice and text alone?

Dr. Ananth G: Yes, multimodal agents are the future for DaveAI. We are piloting two projects, one with a large private sector bank and another with an insurance provider. Both are showing promising results, with a higher level of engagement and better KPIs compared to text-based agents.

How does designing agents shift your approach to data, privacy, and ethical AI? Are there new guardrails needed as agent autonomy increases?

Dr. Ananth G: Absolutely, guardrails are very important. Ours is a layered approach: the first layer is privacy, the second is data security, and the third is data sourcing and attribution.
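One way to picture this layering is as a chain of checks a request must pass before the agent acts. The layer names follow the interview; the individual checks below are minimal hypothetical examples, not DaveAI's actual rules.

```python
import re

# Minimal sketch of layered guardrails: every layer must approve a request.
# Layer names follow the interview; the checks themselves are hypothetical.

def privacy_layer(text):
    # Block obvious personal data, e.g. long digit runs that look like
    # card or account numbers.
    return not re.search(r"\d{10,}", text)

def data_security_layer(text):
    # Reject attempts to pull raw internal data.
    return "dump database" not in text.lower()

def sourcing_layer(text):
    # A real system would verify the answer cites approved sources.
    return True

LAYERS = [privacy_layer, data_security_layer, sourcing_layer]

def passes_guardrails(text):
    """A request proceeds only if all layers approve it."""
    return all(layer(text) for layer in LAYERS)
```

Ordering the layers means the cheapest and most critical checks (privacy) run first, and a rejection at any layer stops the request before the agent acts on it.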

Looking ahead 2–3 years, what is your vision for agent-led commerce or service delivery? Will we still need apps and interfaces—or will agents take the lead entirely?

Dr. Ananth G: I think a lot of apps will definitely be replaced by agentic microbots. However, not all app interfaces can be effectively replaced by a typing interface. Apps that are highly visual and need to showcase the product will still be required. If you can do something in a traditional GUI-based app with a couple of taps, people wouldn’t want to type or speak a query to achieve the same. I believe a new type of interface will evolve that uses the best of both worlds, where repeated actions like scrolling and selecting will still dominate, but the world of menus, pages, settings and so on will be replaced.

We're a leading interview-based website for Indian and international personalities in various fields, including showbiz, business, lifestyle and technology.
