
Basic Chatbot - End to End Agentic AI Chatbot

Problem Statement:

In my previous project, I successfully built an end-to-end chatbot using LLMs, which was able to handle general queries effectively. However, one major limitation was its inability to fetch real-time information from external sources. This highlighted the need for extending the chatbot’s capability beyond static knowledge.

To address this, I have built a chatbot with tool integration, where the assistant intelligently decides whether to answer from its pre-trained knowledge or make a tool call via an external API. With this integration, the chatbot can fetch recent, real-time data and contextually relevant external content, making it far more dynamic and useful.
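
To make that decision step concrete, here is a minimal Python sketch of the idea, assuming a LangChain-style chat model with tool binding; the model name and the get_ai_news stub are illustrative placeholders rather than the project's actual tool.

# Minimal sketch: let the model decide between answering directly and calling a tool.
# Assumes a LangChain-style chat model; the model name and the get_ai_news stub are
# hypothetical placeholders, not the project's actual Web Search tool.
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def get_ai_news(topic: str) -> str:
    """Fetch recent headlines for a topic (stub standing in for the real Web Search API)."""
    return f"(stub) latest headlines about {topic}"


llm_with_tools = ChatOpenAI(model="gpt-4o-mini").bind_tools([get_ai_news])

reply = llm_with_tools.invoke("What happened in AI research this week?")
# A general-knowledge question comes back as plain text (reply.tool_calls is empty);
# a real-time question comes back with a structured tool call for the workflow to run.
print(reply.tool_calls or reply.content)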

The core objective of this project is to demonstrate how LLM-powered assistants can combine internal knowledge with external API integration to provide scalable, intelligent, and context-aware responses. By designing a modular workflow and implementing the corresponding node functions, this project highlights the power of AI workflow automation, web search integration, and hybrid response generation.

Project Details:

Overview

After completing a basic end-to-end chatbot using LLMs, I identified a key limitation—while it could answer general queries, it could not access real-time information. To overcome this, the new project focuses on building a chatbot with tool integration that can leverage external APIs to fetch live, contextual data and enhance the assistant’s capabilities.

 

Application to be Developed

The application to be developed is a workflow-driven AI chatbot capable of handling both knowledge-based queries and real-time information requests. By integrating with the Web Search API, the chatbot will fetch the latest AI news or other dynamic data when required, while still relying on its LLM for general queries. The result is a hybrid chatbot assistant with broader functionality and practical usability.
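
As an illustration of how the Web Search API could be wrapped as a callable tool, here is a minimal sketch assuming a generic REST search endpoint and a LangChain-style tool decorator; the endpoint URL, API key variable, and response fields are hypothetical placeholders.

# Minimal sketch of wrapping a web-search endpoint as a tool the LLM can call.
# The endpoint URL, SEARCH_API_KEY variable, and response fields are hypothetical
# placeholders for whichever Web Search API the project uses.
import os
import requests
from langchain_core.tools import tool

SEARCH_ENDPOINT = "https://api.example-search.com/v1/search"  # placeholder URL


@tool
def web_search(query: str) -> str:
    """Search the web and return titles and snippets as context for the LLM."""
    resp = requests.get(
        SEARCH_ENDPOINT,
        params={"q": query, "count": 5},
        headers={"Authorization": f"Bearer {os.environ['SEARCH_API_KEY']}"},
        timeout=10,
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    # Concatenate titles and snippets; the workflow feeds this text back to the LLM.
    return "\n".join(f"{r.get('title', '')}: {r.get('snippet', '')}" for r in results)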

Technical Approach

  • Extend the existing chatbot codebase by integrating a new use case: chatbot with tool.

  • Implement a workflow architecture with nodes and edges to manage tool calls (see the workflow sketch after this list).

  • Use Web Search API for external search requests and integrate API responses as contextual input for the LLM.

  • Ensure conditional execution, where the chatbot only makes a tool call when external information is needed.

  • Maintain a modular architecture that supports future integration of additional APIs and tools.
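
These steps map naturally onto a small graph. Below is a minimal sketch assuming a LangGraph-style StateGraph, reusing the web_search tool from the earlier sketch; the model name and node names are placeholders rather than the project's exact implementation.

# Minimal sketch of the node/edge workflow with a conditional tool call,
# assuming LangGraph; web_search is the tool sketched earlier and the model
# name is a placeholder.
from typing import Annotated
from typing_extensions import TypedDict
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition

class State(TypedDict):
    messages: Annotated[list, add_messages]  # running chat history

llm_with_tools = ChatOpenAI(model="gpt-4o-mini").bind_tools([web_search])

def chatbot(state: State):
    # The LLM either answers directly or emits a tool call for web_search.
    return {"messages": [llm_with_tools.invoke(state["messages"])]}

builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_node("tools", ToolNode([web_search]))           # executes the tool call
builder.add_edge(START, "chatbot")
builder.add_conditional_edges("chatbot", tools_condition)   # tool call -> "tools", else end
builder.add_edge("tools", "chatbot")                        # feed results back to the LLM
graph = builder.compile()

The conditional edge is what keeps tool calls optional: when the LLM's reply contains no tool call, the graph ends and the answer is returned directly, so general queries never consume the search API quota.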

Deployment

The solution will be deployed in the same front-end framework as the previous chatbot project, with an added option for the new use case. Users will enter queries through a text box, and the chatbot will determine whether to respond directly or make a tool call via the API. The integration will be tested for scalability, accuracy, and adherence to the API request limit (500–600 requests/day).
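
As a rough sketch of the request handler behind the text box, assuming the compiled graph from the workflow sketch above; the limit value and in-memory counter are illustrative only, not the deployed implementation.

# Minimal sketch of the request handler behind the text box, assuming the compiled
# "graph" from the workflow sketch. The limit and in-memory counter are illustrative;
# a real deployment would persist usage per day.
DAILY_SEARCH_LIMIT = 500      # stay under the 500-600 requests/day quota
search_calls_today = 0

def handle_user_query(user_input: str) -> str:
    global search_calls_today
    result = graph.invoke({"messages": [("user", user_input)]})
    # Count tool messages produced for this query to track the search-API budget.
    search_calls_today += sum(1 for m in result["messages"] if m.type == "tool")
    if search_calls_today >= DAILY_SEARCH_LIMIT:
        print("Warning: approaching the daily search-API quota.")
    return result["messages"][-1].content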

Outcome

The project will deliver a scalable AI chatbot capable of both knowledge-based answers and real-time API-driven responses. This hybrid design demonstrates how LLMs can be enhanced with external tools, providing intelligent, dynamic, and context-aware interactions. The outcome will serve as a practical example of AI workflow automation, API integration, and modular chatbot development for real-world applications.

Final Output:
