Duration: 55:06
PART 1 — Analytical Summary: Developing Odoo modules using AI 🚀
Context 💼
In this 55-minute technical session, Joseph — an Odoo developer formerly on the Point of Sale team and now part of Odoo’s AI team — delivers a practical guide to building Odoo modules that leverage AI features. The talk clarifies Odoo’s current approach: Odoo is not training its own models; it is a consumer of external LLMs. Today, GPT and Gemini models are supported. The session shows how to use Odoo’s AI building blocks to implement a concrete enhancement in the Helpdesk to Projects workflow, with careful attention to tool calling, prompts, structured responses, and developer best practices.
Core ideas & innovations 🧠
Joseph frames two core building blocks in Odoo’s AI stack:
- At a low level, the request_llm function (from the AI module's LLM API service) runs a query-processing loop that supports LLM tool calling. Developers define the model, prompts, tools, and a return schema; Odoo then orchestrates calls to the AI provider, executes server-side tools when the model requests them, and feeds results back into the loop. An optimization flag enables an "end_message" short-circuit that avoids an unnecessary final API call.
- At a higher level, the AI Agent model wraps more context and configuration, including system prompts, topics, sources, and RAG (retrieval-augmented generation). The agent's generate_response uses embeddings to retrieve relevant document chunks from the provided sources and includes them in the LLM context. Topics can bundle tools, and server actions can be exposed as tools. While powerful, the talk focuses on building with the lower-level request_llm for clarity.
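The query-processing loop described above can be sketched roughly as follows. This is a simplified illustration, not Odoo's actual implementation; the function shape, message format, and the way the "end_message" short-circuit is signalled are all assumptions:

```python
import json

def run_llm_loop(call_llm, tools, max_iterations=5):
    """Minimal sketch of a tool-calling loop (not Odoo's real code).

    `call_llm` stands in for the provider API call; here it is assumed
    to return either {"tool": name, "arguments": {...}} when the model
    wants a tool executed, or {"end_message": ...} when it is done.
    """
    messages = []
    for _ in range(max_iterations):
        response = call_llm(messages)
        if "end_message" in response:
            # Short-circuit: the model signalled completion, so we skip
            # an extra final round-trip to the provider.
            return response["end_message"]
        tool = tools[response["tool"]]          # tools are server-side
        result = tool(**response["arguments"])  # Odoo executes the logic
        # Feed the tool result back into the loop as context.
        messages.append({"role": "tool", "content": json.dumps(result)})
    raise RuntimeError("tool loop did not terminate")
```

The point of the sketch is the control flow: the developer owns the tools and the schema, while the loop mediates between the model's requests and server-side execution.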
A key concept is that tools are defined server-side (name, description, parameter schema, and logic) and are executed by Odoo when the LLM requests them. The return schema is also developer-defined, enabling structured outputs rather than free text. This is crucial for predictable automation inside business workflows.
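A server-side tool bundles exactly those four pieces: name, description, parameter schema, and logic. A hedged sketch follows; the dictionary shape mirrors the common OpenAI-style function-calling format and may differ from what Odoo actually expects, and the tool name and helper are hypothetical:

```python
# Illustrative tool definition: name, description, parameter schema.
# The exact structure Odoo expects may differ; this follows the common
# OpenAI-style function-calling shape.
get_user_workload_tool = {
    "name": "get_user_workload",
    "description": "Return the number of open tasks per user in a project.",
    "parameters": {
        "type": "object",
        "properties": {
            "project_id": {"type": "integer", "description": "Project to inspect"},
        },
        "required": ["project_id"],
    },
}

def get_user_workload(project_id, open_tasks_by_project):
    """Server-side logic executed when the LLM requests this tool.

    `open_tasks_by_project` is a plain dict standing in for what would
    be an ORM query against project tasks in real module code.
    """
    return open_tasks_by_project.get(project_id, {})
```

The description and parameter schema are what the model "sees"; the logic never leaves the server, which is what makes tool execution safe to gate behind Odoo's own permission checks.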
A practical build: auto-assignment when converting Helpdesk tickets to Project tasks ⚙️
The demo feature improves the “Convert to Task” wizard in Helpdesk by adding an “Auto assign with AI” action. The goal: automatically choose the most appropriate Project (and later, assignee, deadline, and priority) based on the ticket’s subject, description, and tags.
The first iteration passes:
- A system prompt (“You are an assistant helping categorize support tickets…”) and a user prompt containing the ticket details and an efficient table representation of available projects (tables save tokens compared to JSON).
- A JSON schema instructing the LLM to return the chosen project_id, reasoning, and a confidence score.
The result is parsed and validated server-side before updating the wizard and notifying the user. Joseph underscores a best practice: always keep a human-in-the-loop for verification; do not blindly trust LLM “confidence” scores.
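That server-side validation step might look something like this. A minimal sketch under stated assumptions: the field names match the schema described above, and the helper name is hypothetical:

```python
import json

def parse_llm_assignment(raw, valid_project_ids):
    """Validate the model's structured response before touching records.

    Raises ValueError on malformed JSON or an unknown project_id, so the
    wizard can surface the error instead of silently mis-assigning.
    """
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"LLM returned invalid JSON: {exc}")
    project_id = data.get("project_id")
    if project_id not in valid_project_ids:
        raise ValueError(f"LLM chose unknown project {project_id!r}")
    # Treat confidence as advisory only: show it to the user, but never
    # use it to skip the human verification step.
    return project_id, data.get("reasoning", ""), data.get("confidence")
```

Validating the returned ID against the candidate set closes the most common failure mode: a model hallucinating a record that does not exist.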
The second iteration introduces tools to go beyond project selection:
- A tool to fetch ticket SLA details (deadline pressure, priority) and
- A tool to fetch project users’ workload to assign the least-loaded user.
The system prompt is updated to describe the available tools and the expected workflow; the response schema grows to include user_id, deadline, and priority. The result is parsed and assigned to the wizard fields, with the necessary type conversions (e.g., parsing date strings into date objects).
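Those conversions are mundane but easy to get wrong. A sketch of what they might involve, assuming the schema returns ISO date strings (the helper name and field mapping are illustrative, not Odoo's actual wizard code):

```python
from datetime import date

def apply_llm_fields(data):
    """Convert the schema's returned values into wizard-field types.

    Sketch only: real module code would write these to the wizard
    record via the ORM rather than returning a dict.
    """
    return {
        "user_id": int(data["user_id"]),
        "date_deadline": date.fromisoformat(data["deadline"]),
        # Odoo stores priority as a string selection, so coerce it.
        "priority": str(data["priority"]),
    }
```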
In the final refinement, Joseph removes the SLA tool: since the ticket ID and SLA derivation are already known at the time of clicking, those values should be computed immediately and injected directly into the prompt/context — not fetched via tool calling. This reduces unnecessary AI calls and latency. Only the truly dynamic tool (user workload per chosen project) remains. The reasoning illustrates a key design principle: only expose tools when the AI truly needs to decide when and how to use them.
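Concretely, the refinement means SLA facts are formatted into the prompt up front instead of being fetched via a tool. A sketch of that prompt construction (the wording and field names are illustrative, not the talk's actual prompt):

```python
def build_ticket_prompt(ticket, sla):
    """Inject already-known SLA facts directly into the user prompt.

    Because the ticket is known the moment the user clicks, deriving
    the SLA here avoids an extra tool-call round-trip to the LLM.
    """
    return (
        f"Ticket: {ticket['subject']}\n"
        f"Description: {ticket['description']}\n"
        f"SLA deadline: {sla['deadline']} (priority {sla['priority']})\n"
        "Choose the best project and the least-loaded user."
    )
```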
Impact & takeaways 💬
This session demystifies how to add AI to Odoo features with a production mindset:
- The request_llm loop offers precise control over model, prompts, tools, and schemas, enabling robust, structured automations.
- The AI Agent and RAG abstractions streamline context-building from documents and topics, useful when you want configurable behavior and knowledge grounding.
- Design carefully which data to compute upfront versus what to expose as tools; fewer tool calls means less latency and lower cost.
- Use tables to reduce token usage; define schemas to standardize outputs; always validate IDs and handle parsing errors.
- Keep human verification steps for risky actions and avoid over-reliance on LLM-reported confidence.
Q&A highlights and developer notes 🧩
Joseph addressed important practicalities:
- Odoo currently supports GPT and Gemini. No in-house training; Odoo is a consumer of external LLMs.
- Tooling is conceptually compatible with MCP; Odoo isn’t running MCP end-to-end today. Batch tool calls are possible but require explicit prompting.
- Agent-to-agent interactions aren’t supported yet.
- Governance and safety: keep a verification step before applying AI suggestions to live records; be mindful of permissions, company boundaries, and data sensitivity (GDPR).
- Logging: tool usage is logged; token counts are estimates. Proper error handling around schemas and parsing is the developer’s responsibility.
- Billing: Odoo SaaS currently uses Odoo’s key; users can also bring their own API keys and bear associated costs.
- Prerequisites: PGVector is required for embeddings; vectors are stored in the Odoo database on-prem.
- Documentation: “extensive” dev docs are not yet available; more to come.
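The PGVector-backed embedding retrieval mentioned above boils down to nearest-neighbor search over stored vectors. A toy in-memory illustration of the idea (Odoo does this in the database via PGVector; all names here are assumptions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve_chunks(query_vec, chunks, top_k=2):
    """Rank pre-embedded document chunks by similarity to the query.

    `chunks` is a list of (text, embedding) pairs; the top matches are
    injected into the LLM context ahead of the user prompt.
    """
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]
```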
Net effect: Developers can already ship helpful, human-verified automations across Helpdesk, Projects, and beyond — today — using standard LLMs, structured outputs, and judicious tool design. The approach balances simplicity and control while staying mindful of scalability, compliance, and UX.
PART 2 — Viewpoint: Odoo Perspective
Disclaimer: AI-generated creative perspective inspired by Odoo's vision.
What I love here is the pragmatism: keep it simple, integrate deeply, and let the platform do the heavy lifting. With tools, schemas, and RAG, you can shape AI to fit real business workflows — not as a novelty, but as an invisible helper that shortens clicks and reduces errors.
The key is a thoughtful boundary: compute what you already know, expose only the tools that matter, and always keep a human step where the stakes are high. When we make AI feel native to Odoo — respecting permissions, projects, and processes — we deliver value that compounds across apps and the community can build on it.
PART 3 — Viewpoint: Competitors (SAP / Microsoft / Others)
Disclaimer: AI-generated fictional commentary. Not an official corporate statement.
Odoo’s developer ergonomics around tool calling and structured outputs are compelling, especially for SMB and midmarket teams seeking quick wins. The integration across Helpdesk, Projects, and shared data models gives AI context out of the box. The human-in-the-loop stance is sensible given LLM variability.
The challenge at scale is governance: rigorous audit trails, policy controls across multi-company boundaries, regional data residency, and industry-specific compliance. Enterprise buyers will expect robust evaluation frameworks, deterministic fallbacks, and fine-grained separation of duties. If Odoo continues to mature logging, cost controls, and compliance posture — while maintaining its excellent UX velocity — it becomes a credible, differentiated AI platform for business operations.
Disclaimer: This article contains AI-generated summaries and fictionalized commentaries for illustrative purposes. Viewpoints labeled as "Odoo Perspective" or "Competitors" are simulated and do not represent any real statements or positions. All product names and trademarks belong to their respective owners.