Building a coding assistant with an MCP server and an LLM agent eliminates the tedious, error-prone task of copying and pasting code snippets.
The project's goal was to create a coding assistant that directly accesses and reasons over live code using the Model Context Protocol (MCP) and a Large Language Model (LLM) agent.
MCP bridges the gap between LLMs and local files and environments by defining a lightweight protocol specification.
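Under the hood, MCP messages follow JSON-RPC 2.0. As a rough illustration, a tool invocation travels over the wire in a shape like the following; the tool name and arguments here are hypothetical, not taken from the project:

```python
# Approximate shape of an MCP "tools/call" request (JSON-RPC 2.0).
# The tool name and arguments are illustrative.
tool_call_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "read_file",
        "arguments": {"path": "src/main.py"},
    },
}
```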
The MCP server acts as a gateway, exposing tools such as read_file and list_directory so that LLM agents can interact securely with external data sources.
Custom tools can be added to an MCP server by writing a Python function and registering it with the @mcp.tool() decorator.
An MCP server can run as a standalone Python process, as the project demonstrates with FastMCP.
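Putting the last three points together, a minimal sketch of such a server might look like this; the tool bodies are simplified stand-ins, and a real implementation would validate paths against the project root:

```python
# server.py - minimal MCP server sketch using FastMCP.
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("coding-assistant")

@mcp.tool()
def read_file(path: str) -> str:
    """Return the full text content of a file."""
    return Path(path).read_text(encoding="utf-8")

@mcp.tool()
def list_directory(path: str = ".") -> list[str]:
    """List the entries in a directory."""
    return [entry.name for entry in Path(path).iterdir()]

if __name__ == "__main__":
    # Run as a standalone process speaking MCP over stdio.
    mcp.run(transport="stdio")
```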
The project uses LangChain and LangGraph to create an agent that orchestrates interactions between the LLM and the MCP tools in a ReAct-style loop.
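The article doesn't show the exact wiring, but one plausible sketch uses the langchain-mcp-adapters package to expose the MCP tools as LangChain tools and LangGraph's prebuilt ReAct agent; the model identifier and server path are placeholders:

```python
# agent.py - hedged sketch: a LangGraph ReAct agent driving the MCP
# server above. Assumes langchain-mcp-adapters; model is a placeholder.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async def main() -> None:
    client = MultiServerMCPClient(
        {
            "code": {
                "command": "python",
                "args": ["server.py"],  # the FastMCP server sketched above
                "transport": "stdio",
            }
        }
    )
    tools = await client.get_tools()  # MCP tools wrapped as LangChain tools
    agent = create_react_agent("openai:gpt-4o", tools)  # placeholder model
    result = await agent.ainvoke(
        {"messages": [("user", "Summarize what src/main.py does.")]}
    )
    print(result["messages"][-1].content)

asyncio.run(main())
```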
The system prompt guides the LLM's behavior, telling it how to approach coding tasks and when to reach for the available tools.
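The article doesn't reproduce the exact prompt, but a representative system prompt along these lines might read:

```python
# Illustrative system prompt; the project's actual wording isn't shown.
SYSTEM_PROMPT = (
    "You are a coding assistant with access to the user's project files. "
    "Use the list_directory tool to explore the project and the read_file "
    "tool to inspect source files before answering. Never guess at file "
    "contents: read them. Keep answers concise and cite file paths."
)
```

With LangGraph's prebuilt agent, such a prompt can typically be supplied via create_react_agent's prompt argument.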
A FastAPI backend manages the connection between the LLM, the MCP tools, and the ReAct logic, exposing the assistant as an HTTP service.
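As an illustration only, such a backend could build the agent once at startup and expose a single chat endpoint; the /chat route, request schema, and model are assumptions rather than the project's actual design:

```python
# app.py - illustrative FastAPI backend; route and schema are assumed.
from contextlib import asynccontextmanager

from fastapi import FastAPI
from pydantic import BaseModel

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Build the agent once at startup and share it across requests.
    client = MultiServerMCPClient(
        {"code": {"command": "python", "args": ["server.py"], "transport": "stdio"}}
    )
    tools = await client.get_tools()
    app.state.agent = create_react_agent("openai:gpt-4o", tools)
    yield

app = FastAPI(title="MCP coding assistant", lifespan=lifespan)

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")
async def chat(req: ChatRequest) -> dict:
    result = await app.state.agent.ainvoke(
        {"messages": [("user", req.message)]}
    )
    return {"reply": result["messages"][-1].content}
```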
A Streamlit UI provides a user-friendly front end for chatting with the coding assistant and specifying the project's root directory.
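A minimal sketch of such a front end, assuming it calls the hypothetical /chat endpoint above; the widget layout and the way the root directory is passed along are guesses:

```python
# ui.py - illustrative Streamlit front end; backend URL and endpoint
# are assumptions.
import requests
import streamlit as st

st.title("MCP Coding Assistant")

root_dir = st.sidebar.text_input("Project root directory", value=".")

if prompt := st.chat_input("Ask about your code..."):
    st.chat_message("user").write(prompt)
    resp = requests.post(
        "http://localhost:8000/chat",
        json={"message": f"Project root: {root_dir}\n\n{prompt}"},
        timeout=120,
    )
    st.chat_message("assistant").write(resp.json()["reply"])
```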