News

Local LLMs such as a fine-tuned Llama 3 model offer robust performance for tool calling. A typical implementation involves defining a Python function, binding it to the LLM as a tool, and testing the execution.
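The define-bind-execute flow can be sketched without any framework. In this sketch, `get_weather` is a hypothetical example tool, and `model_response` is a hard-coded stand-in for the structured tool call a local Llama 3 (served via Ollama) would emit; a real setup would get this dict from the model's API response.

```python
import json

# Hypothetical example tool: a plain Python function the LLM may call.
def get_weather(city: str) -> str:
    """Return a canned weather report for a city."""
    return f"It is sunny in {city}."

# Registry mapping tool names to callables -- this is what "binding"
# a function to the LLM boils down to on the execution side.
TOOLS = {"get_weather": get_weather}

# Stand-in for the model's output; a real local LLM would return a
# structured tool call like this when it decides the tool is needed.
model_response = {
    "tool_calls": [
        {"name": "get_weather", "arguments": json.dumps({"city": "Berlin"})}
    ]
}

def run_tool_calls(response: dict) -> list[str]:
    """Look up each named tool and execute it with the model's arguments."""
    results = []
    for call in response.get("tool_calls", []):
        fn = TOOLS[call["name"]]
        kwargs = json.loads(call["arguments"])
        results.append(fn(**kwargs))
    return results

print(run_tool_calls(model_response))  # ['It is sunny in Berlin.']
```

In a framework like LangChain, the registry and dispatch loop are handled for you, but the moving parts are the same: a function, a schema the model can target, and an executor that feeds the model's arguments back into the function.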
Why write SQL queries when you can get an LLM to write the code for you? Query NFL data using querychat, a new chatbot ...
Today I'll be showing you how to build local AI agents using Python. We'll be using Ollama, LangChain, and ChromaDB, which acts as our vector search database. All of this will be ...
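To show what the vector-search piece does, here is a tiny in-memory store with cosine-similarity ranking, standing in for ChromaDB. The embeddings are toy hand-made vectors, not real model output; in practice an embedding model (e.g. one served by Ollama) would produce high-dimensional vectors, and ChromaDB would handle storage and nearest-neighbour queries.

```python
import math

# Toy document embeddings (stand-ins for real embedding-model output).
DOCS = {
    "Ollama runs LLMs locally.":   [1.0, 0.1, 0.0],
    "ChromaDB stores embeddings.": [0.0, 1.0, 0.2],
    "LangChain chains LLM calls.": [0.1, 0.0, 1.0],
}

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec: list[float], k: int = 1) -> list[str]:
    """Rank documents by similarity to the query vector -- the heart of
    what a vector database's nearest-neighbour query does."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

# A query vector close to the ChromaDB document's embedding.
print(top_k([0.0, 0.9, 0.3]))  # ['ChromaDB stores embeddings.']
```

The agent loop then feeds the retrieved documents into the LLM prompt as context, which is what the ChromaDB + LangChain combination automates.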