Building My First Local AI Agent — No API Keys, No Cloud, Just Me and My M1

I just created my very first local AI agent using Ollama, Llama 3, and a few lines of Python with the requests library — no API keys, no cloud, all on my own machine.

Here’s what agent-hello does:

# main.py
import requests

def talk_to_llama(prompt):
    # Send the prompt to the local Ollama server (default port 11434)
    res = requests.post("http://localhost:11434/api/generate", json={
        "model": "llama3",
        "prompt": prompt,
        "stream": False  # wait for the complete reply instead of streaming tokens
    }, timeout=120)
    res.raise_for_status()  # fail loudly if the server or model isn't ready
    return res.json()["response"]

print("Agent says:\n", talk_to_llama("Hello, who are you?"))

That’s it. A little hello agent on my M1 Mac, which is now a sovereign little AI machine.
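The script sets "stream": False, so requests blocks until the whole reply is ready. Ollama's generate endpoint can also stream the answer as newline-delimited JSON chunks, each carrying a partial "response" field. Here's a minimal sketch of that mode — the function names are mine, not from the script above:

```python
import json
import requests

def join_chunks(lines):
    # Each streamed line is a JSON object with a partial "response" field.
    return "".join(json.loads(line).get("response", "") for line in lines if line)

def talk_to_llama_streaming(prompt):
    # Same endpoint as main.py, but stream=True returns one JSON chunk per line.
    res = requests.post("http://localhost:11434/api/generate", json={
        "model": "llama3",
        "prompt": prompt,
        "stream": True,
    }, stream=True, timeout=120)
    res.raise_for_status()
    return join_chunks(res.iter_lines())
```

In a real terminal you'd print each chunk as it arrives for a typewriter effect; joining them at the end keeps the sketch simple.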

How I Got It Running
1. Installed Ollama

Grabbed the installer from the Ollama download page.

2. Pulled a model

ollama pull llama3

3. Set up a Python virtualenv

mkdir agent-hello && cd agent-hello
python3 -m venv .venv
source .venv/bin/activate
pip install requests
4. Ran the script, and boom, the agent replied.

python3 main.py
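A sanity check worth doing between pulling the model and running the script: confirm the Ollama server is up and the model is actually there. /api/tags is Ollama's list-models endpoint; the helper name here is my own:

```python
import requests

def model_available(name, url="http://localhost:11434/api/tags"):
    # Returns True if a locally pulled model's name starts with `name`,
    # and False if the server isn't reachable at all.
    try:
        res = requests.get(url, timeout=5)
        res.raise_for_status()
    except requests.RequestException:
        return False  # Ollama isn't running (or isn't reachable)
    return any(m["name"].startswith(name) for m in res.json().get("models", []))
```

If this returns False, either start the Ollama app or re-run ollama pull llama3 before launching main.py.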

Why This Matters

I don’t need an OpenAI key
No usage limits
I can make it do anything: clean folders, write blog posts, summarize PDFs
It runs fully local — and can live on a USB stick if I want
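As a taste of the "clean folders" idea: the same local endpoint can be pointed at a list of filenames and asked where each one belongs. This is a sketch of my own, not something from the setup above — the prompt wording and helper names are assumptions:

```python
import requests

def ask_llama(prompt, model="llama3"):
    # Same non-streaming call as main.py, factored out for reuse.
    res = requests.post("http://localhost:11434/api/generate",
                        json={"model": model, "prompt": prompt, "stream": False},
                        timeout=120)
    res.raise_for_status()
    return res.json()["response"]

def build_sort_prompt(filenames):
    # Pure helper: turn a list of filenames into a one-shot sorting prompt.
    listing = "\n".join(f"- {name}" for name in filenames)
    return ("Sort these files into folders (Documents, Images, Code, Other). "
            "Answer as 'filename -> folder', one per line:\n" + listing)

if __name__ == "__main__":
    print(ask_llama(build_sort_prompt(["notes.txt", "photo.jpg", "main.py"])))
```

From there it's a small step to parse the model's answer and actually move files — all without a token leaving the machine.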
