The semantic layer for Python.
Write what you mean. Python does the rest.
OpenHosta integrates Large Language Models directly into Python as native functions. Define a function with type hints and a docstring — OpenHosta uses AI to implement it. No DSL, no wrappers, just Python.
```python
from OpenHosta import emulate

def translate(text: str, language: str) -> str:
    """Translates the text into the specified language."""
    return emulate()

print(translate("Hello World!", "French"))
# 'Bonjour le monde !'
```

OpenHosta also enables semantic testing — evaluate conditions that require cultural knowledge or fuzzy logic, something traditional `assert` statements can never do:
```python
from OpenHosta import test

sentence = "You are a nice person."

if test(f"this contains an insult: {sentence}"):
    print("The sentence is considered an insult.")
else:
    print("The sentence is not considered an insult.")
# The sentence is not considered an insult.
```

- Zero DSL — Pure Python syntax. Your functions stay readable, testable, and IDE-friendly.
- Type-safe — Guarded types validate LLM output against your annotations (`int`, `dict`, `Enum`, Pydantic, `Callable`…).
- Model-agnostic — Works with OpenAI, Ollama, Azure, vLLM — any OpenAI-compatible endpoint.
- Runs offline — Full local execution with Ollama. Your data stays private.
- Production-ready — Uncertainty tracking, cost tracking, audit mode, and async support built-in.
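The guarded-type idea above can be illustrated with plain Pydantic — a minimal sketch that never calls an LLM; the `City` model and the raw JSON strings are invented for illustration, standing in for what a model might return:

```python
import json
from pydantic import BaseModel, ValidationError

class City(BaseModel):
    name: str
    population: int

# A well-formed answer parses cleanly into the annotated type...
good = City(**json.loads('{"name": "Paris", "population": 2102650}'))
print(good.population)  # 2102650

# ...while a malformed one is rejected instead of silently propagating bad data.
try:
    City(**json.loads('{"name": "Paris", "population": "many"}'))
except ValidationError as e:
    print("rejected:", len(e.errors()), "validation error(s)")
```

This is the same guarantee OpenHosta's guarded types aim to give you: the LLM's output either conforms to your annotations or raises a clear error.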
```shell
pip install OpenHosta
```

We recommend using a virtual environment (`python -m venv .venv`). See the full installation guide for local model setup, optional dependencies, and troubleshooting.
Ensure you have Ollama installed, then run `ollama run qwen3.5:4b` in your terminal.
```python
from OpenHosta import emulate, OpenAICompatibleModel, config

# 1. Point OpenHosta to your local Ollama instance
local_model = OpenAICompatibleModel(
    model_name="qwen3.5:4b",
    base_url="http://localhost:11434/v1",
    api_key="none"  # Ollama does not require a key
)
config.DefaultModel = local_model

# 2. Define and call your function
def translate(text: str, language: str) -> str:
    """Translates the text into the specified language."""
    return emulate()

print(translate("Hello World!", "French"))
# 'Bonjour le monde !'
```

Create a `.env` file in your project directory:
```env
OPENHOSTA_DEFAULT_MODEL_NAME="gpt-4.1"
OPENHOSTA_DEFAULT_MODEL_API_KEY="your-api-key-here"
```

```python
from OpenHosta import emulate

def translate(text: str, language: str) -> str:
    """Translates the text into the specified language."""
    return emulate()

print(translate("Hello World!", "French"))
# 'Bonjour le monde !'
```

| Feature | Description |
|---|---|
| `emulate` | AI-implemented functions from docstrings |
| `emulate_async` | Non-blocking async variant for concurrency |
| `emulate_iterator` | Streaming results via lazy generators |
| `closure` | Semantic lambda functions |
| `test` | Fuzzy logic / semantic boolean tests |
| Types & Pydantic | `int`, `dict`, `Enum`, `dataclass`, Pydantic, `Callable`… |
| Safe Context | Uncertainty tracking & error handling |
| Image input | Pass `PIL.Image` directly to functions |
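The concurrency pattern that `emulate_async` enables can be sketched with standard `asyncio`. This sketch replaces the LLM call with a stub coroutine (`translate_async` and its canned reply are invented for illustration), so no model is needed to run it:

```python
import asyncio

async def translate_async(text: str, language: str) -> str:
    """Stand-in for an emulate_async-backed function: returns a canned reply."""
    await asyncio.sleep(0)  # yield control, as a real network call would
    return f"[{language}] {text}"

async def main() -> list[str]:
    # Fan out several translations concurrently instead of awaiting them one by one
    return await asyncio.gather(
        translate_async("Hello", "French"),
        translate_async("Hello", "Spanish"),
    )

results = asyncio.run(main())
print(results)
```

With a real model behind each call, `asyncio.gather` overlaps the network latency of the requests rather than paying for them sequentially.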
📖 Full Documentation · 📝 Changelog · 🧪 Examples
We warmly welcome contributions! Please refer to our Contribution Guide and Code of Conduct.
Browse existing issues to find contribution ideas.
MIT License — see LICENSE for details.
- Emmanuel Batt — Manager and Coordinator, Founder of Hand-e
- William Jolivet — DevOps, SysAdmin
- Léandre Ramos — AI Developer
- Merlin Devillard — UX Designer, Product Owner
GitHub: https://github.com/hand-e-fr/OpenHosta
The future of development is human. — The OpenHosta Team
