this post was submitted on 11 Mar 2026
16 points (90.0% liked)

homeassistant


Home Assistant is open source home automation that puts local control and privacy first.
Powered by a worldwide community of tinkerers and DIY enthusiasts.

 

I mostly understand the dilemma, but I want to see if someone has had better success with their AI assistant. I use the Ollama integration and set up a conversation model. The default Home Assistant conversation agent knows to use the home forecast entity whenever I ask about the weather, but no matter whether I also set up an AI task model, toggle "control Home Assistant" on or off, or toggle "perform local commands" on or off, the Ollama models never reference the home forecast the way the default agent can. I thought keeping default commands on might enable this ability while the Ollama LLM answers all other queries. I just want a smarter AI. Any suggestions?

[–] hendrik@palaver.p3x.de 2 points 2 weeks ago* (last edited 2 weeks ago)

What I do is use extended_openai_conversation from HACS to hook into my LLM's OpenAI-compatible API endpoint. That one makes it available through the regular Voice Assistant stuff within Home Assistant.
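For context, extended_openai_conversation lets you define extra functions the LLM is allowed to call, which is how you can hand it the forecast. A rough sketch of such a function definition (entered in the integration's options); the entity id `weather.home` and the schema details are assumptions from memory, so check the project's README for the current format:

```yaml
# Hypothetical function definition for extended_openai_conversation.
# weather.home is an assumed entity id; adjust to your setup.
- spec:
    name: get_weather_forecast
    description: Get the daily weather forecast for the home
    parameters:
      type: object
      properties: {}
  function:
    type: script
    sequence:
      - service: weather.get_forecast
        target:
          entity_id: weather.home
        data:
          type: daily
        # extended_openai_conversation returns this variable to the LLM
        response_variable: _function_result
```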

Not sure what's happening here. The Ollama integration's page says it doesn't have the full functionality; for example, it doesn't support sentence triggers. And the weather forecast is a bit of a weird one in Home Assistant: it's not an entity (unless you configure one manually) but a service call to fetch the forecast. Maybe your AI just doesn't have the forecast available, only the current conditions and maybe the current temperature. Everything else has to be specifically requested with a deliberate "weather.get_forecast" call. Maybe that service call and the specific processing are in the official Assistant, but not in the Ollama integration?
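For reference, fetching the forecast by hand looks roughly like this script action. The entity id is an assumption, and note that newer Home Assistant releases renamed the service to the plural `weather.get_forecasts`:

```yaml
# Hedged sketch: fetch the daily forecast via a service call.
# weather.home is an assumed entity id.
service: weather.get_forecast
target:
  entity_id: weather.home
data:
  type: daily
response_variable: forecast
```

So an agent only "knows" the forecast if something in the pipeline makes this call and feeds the response back to it.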