Large Language Models (LLMs) from providers such as OpenAI, Anthropic, and Perplexity come with limitations, which creates a need for locally running, customizable LLM instances. This guide walks through setting up a local AI environment using the Ollama framework and the Open WebUI interface.

Ollama offers access to numerous open-source models, with phi4-mini being a recommended model to start with. Installing Ollama requires downloading and unzipping the package, while Open WebUI can be installed using pip.

Open WebUI provides features such as web search integration, a code interpreter for Python execution, and customization of the AI's behavior. Web search supports engines like Google PSE and Brave Search, extending what the AI can draw on when answering. The code interpreter lets the AI execute Python code within the chat interface, enabling dynamic programming tasks. Customization options allow users to tailor the AI's behavior to specific use cases.

By following this guide, users can create a robust local AI environment comparable to commercial platforms, entirely under their own control. The setup process takes about an hour and grants users a powerful AI tool for tasks like data analysis, programming, and mathematics. Future articles in the series will explore advanced features and functionalities of the local AI setup.
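The installation steps described above can be sketched as a short command sequence. This is a minimal sketch assuming a Linux or macOS host; the install script URL, the `phi4-mini` model tag, and the default ports reflect Ollama's and Open WebUI's documented defaults, but check the current docs for your platform (on macOS, Ollama is typically downloaded and unzipped from ollama.com rather than installed via script):

```shell
# Install Ollama (Linux convenience script; macOS users download the app instead)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the recommended starter model and give it a quick smoke test
ollama pull phi4-mini
ollama run phi4-mini "Reply with one short sentence."

# Install and launch Open WebUI via pip (a recent Python 3.11 is assumed)
pip install open-webui
open-webui serve    # the UI is served at http://localhost:8080 by default
```

If Open WebUI does not detect your local models automatically, its connection settings can be pointed at Ollama's default API endpoint, `http://localhost:11434`.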
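To make the code interpreter concrete, here is the kind of small, self-contained Python snippet the AI might generate and execute inside the chat for a data-analysis request. The sales figures are invented purely for illustration:

```python
import statistics

# Hypothetical monthly sales figures supplied in the chat
sales = [1200, 1350, 1280, 1500, 1420, 1610]

# Basic descriptive statistics the interpreter can compute on the fly
mean = statistics.mean(sales)
stdev = statistics.stdev(sales)

# Percentage growth from the first to the last month
growth = (sales[-1] - sales[0]) / sales[0] * 100

print(f"Mean: {mean:.2f}")            # Mean: 1393.33
print(f"Std dev: {stdev:.2f}")
print(f"Growth over period: {growth:.2f}%")  # Growth over period: 34.17%
```

Because the code runs locally, results like these come back into the conversation without any data leaving the machine.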