diff --git a/CLAUDE.md b/CLAUDE.md index 1ac45c61..04b5d446 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -63,7 +63,7 @@ uv add --dev # Add dev dependency ## 4. Coding Standards -- **Python**: 3.10-3.11, prefer sync for legacy EHR compatibility; async available for modern systems but use only when explicitly needed +- **Python**: 3.10-3.13, prefer sync for legacy EHR compatibility; async available for modern systems but use only when explicitly needed - **Dependencies**: Pydantic v2 (<2.11.0), NumPy <2.0.0 (spaCy compatibility) - **Environment**: Use `uv` to manage dependencies and run commands (`uv run `) - **Formatting**: `ruff` enforces project style @@ -140,9 +140,26 @@ When responding to user instructions, follow this process: - Prefer existing abstractions over new ones - Run: `uv run ruff check . --fix && uv run ruff format .` - If stuck, return to step 3 to re-plan -5. **Review**: Summarize files changed, key design decisions, and any follow-ups or TODOs +5. **Review**: Summarize files changed, key design decisions, and any follow-ups or TODOs. Always check: + - **Tests**: Do any existing tests need updating? Are there gaps worth flagging? + - **Docs**: Do any doc pages, docstrings, or examples need updating? + - **Cookbooks**: Do any cookbook examples need updating to reflect the change? 6. **Session Boundaries**: If request isn't related to current context, suggest starting fresh to avoid confusion +### Committing Changes + +When the developer is ready to commit, AI should: +1. Run the review checklist from step 5 above — flag anything outstanding +2. Run `git status` and `git diff` to identify what's changed +3. Group changes into logical commits if needed (e.g. don't mix a feature with a test infra change) +4. Suggest which files to stage for each commit +5. Propose a commit message following [Conventional Commits](https://www.conventionalcommits.org/) style +6. 
Let the developer review and run the actual `git commit` themselves — AI never runs `git commit` or `git push` + +### Writing Cookbook Examples + +Follow the principles in [CONTRIBUTING.md — Writing Cookbooks](CONTRIBUTING.md#writing-cookbooks). Key rule: reduce time-to-running above all else — pre-bake demo data, collapse advanced setup, and lead with the problem the cookbook solves. + ### Adding New FHIR Resource Utilities 1. Check for existing utilities in @healthchain/fhir/ @@ -164,4 +181,4 @@ When responding to user instructions, follow this process: --- -**Last updated**: 2025-12-17 +**Last updated**: 2026-04-20 diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 82ad56c9..39de6ff0 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -78,6 +78,17 @@ When writing docs: - Include code snippets or configuration examples where helpful. - Call out assumptions, limitations, and safety‑relevant behaviour explicitly. +### Writing Cookbooks + +Cookbooks are often the first thing a developer runs. These principles keep them effective: + +- **Reduce time-to-running**: Every prerequisite you can eliminate or defer is a developer you don't lose. Pre-bake demo data and models where possible; collapse advanced setup into `??? details` blocks. +- **Lead with the problem**: The intro should say what pain it solves — "you trained a model on CSVs, now you need to deploy against FHIR data" — not just what the code does. +- **Show HealthChain's unique value**: Each cookbook should have a moment that would be 50+ lines of custom code without HealthChain (`Dataset.from_fhir_bundle()`, `merge_bundles()`, `FHIRAuthConfig.from_env()`). Don't bury it. +- **Complement, don't replace**: Anywhere an existing tool (LangChain, FastAPI, sklearn) appears alongside HealthChain, say explicitly that they work together. Reduces the "should I switch?" anxiety. +- **Be honest about the roadmap**: Compliance, eval, and audit features are in progress — reference them as such. 
Developers trust you more for it. +- **Collapse advanced paths, don't omit them**: The `??? details` pattern keeps the main path clean without losing information for power users. + ## 💻 Writing Code >**New to HealthChain?** Look for [`good first issue`](https://github.com/dotimplement/HealthChain/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) and [`help wanted`](https://github.com/dotimplement/HealthChain/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) labels. diff --git a/README.md b/README.md index 48cbbf7a..fe0439cb 100644 --- a/README.md +++ b/README.md @@ -297,4 +297,4 @@ This project builds on [fhir.resources](https://github.com/nazrulworld/fhir.reso [build]: https://github.com/dotimplement/HealthChain/actions?query=branch%3Amain [discord]: https://discord.gg/UQC6uAepUz [substack]: https://jenniferjiangkells.substack.com/ -[claude-md]: CLAUDE.MD +[claude-md]: CLAUDE.md diff --git a/cookbook/cds_discharge_summarizer_hf_chat.py b/cookbook/cds_discharge_summarizer_hf_chat.py index 3bb67805..eb79a76f 100644 --- a/cookbook/cds_discharge_summarizer_hf_chat.py +++ b/cookbook/cds_discharge_summarizer_hf_chat.py @@ -69,7 +69,6 @@ def discharge_summarizer(request: CDSRequest) -> CDSResponse: app = HealthChainAPI( title="Discharge Note Summarizer", description="AI-powered discharge note summarization service", - port=8000, service_type="cds-hooks", ) app.register_service(cds, path="/cds") diff --git a/cookbook/cds_discharge_summarizer_hf_trf.py b/cookbook/cds_discharge_summarizer_hf_trf.py index 0e438105..4d7eb792 100644 --- a/cookbook/cds_discharge_summarizer_hf_trf.py +++ b/cookbook/cds_discharge_summarizer_hf_trf.py @@ -48,7 +48,6 @@ def discharge_summarizer(request: CDSRequest) -> CDSResponse: app = HealthChainAPI( title="Discharge Note Summarizer", description="AI-powered discharge note summarization service", - port=8000, service_type="cds-hooks", ) app.register_service(cds, path="/cds") diff --git a/cookbook/data/medplum_seed.py 
b/cookbook/data/medplum_seed.py new file mode 100644 index 00000000..bb3b2332 --- /dev/null +++ b/cookbook/data/medplum_seed.py @@ -0,0 +1,148 @@ +#!/usr/bin/env python3 +""" +Seed Medplum with a synthetic patient for FHIR Q&A demos. + +Setup: + 1. Sign up at https://app.medplum.com and create a client application + (Settings → Security → Client Applications → New Client) + 2. Add to .env: + MEDPLUM_CLIENT_ID=your_client_id + MEDPLUM_CLIENT_SECRET=your_client_secret + MEDPLUM_BASE_URL=https://api.medplum.com/fhir/R4 + MEDPLUM_TOKEN_URL=https://api.medplum.com/oauth2/token + +Run: + uv run python cookbook/data/medplum_seed.py + +Output: + Prints the created patient ID — add it to .env as DEMO_PATIENT_ID. +""" + +import httpx +from dotenv import load_dotenv +from urllib.parse import urlparse, urlunparse + +from healthchain.gateway.clients.fhir.base import FHIRAuthConfig +from healthchain.gateway.clients.auth import OAuth2TokenManager + +load_dotenv() + +config = FHIRAuthConfig.from_env("MEDPLUM") +_token_manager = OAuth2TokenManager(config.to_oauth2_config()) + + +def _auth_headers() -> dict: + return { + "Authorization": f"Bearer {_token_manager.get_access_token()}", + "Content-Type": "application/fhir+json", + "Accept": "application/fhir+json", + } + + +def _transaction_url(raw_base_url: str) -> str: + """Normalize base URL so httpx always receives an HTTP(S) endpoint.""" + parsed = urlparse(raw_base_url) + + if parsed.scheme == "fhir": + # Support connection-string style env values: + # fhir://host/path?client_id=...&token_url=... + return urlunparse(("https", parsed.netloc, parsed.path, "", "", "")).rstrip("/") + + if parsed.scheme in {"http", "https"}: + # Drop query params if someone copied a connection-string-like URL into BASE_URL. + return urlunparse( + (parsed.scheme, parsed.netloc, parsed.path, "", "", "") + ).rstrip("/") + + raise ValueError( + f"Unsupported MEDPLUM_BASE_URL scheme '{parsed.scheme or ''}'. " + "Use https://... or fhir://..." 
+ ) + + +PATIENT = { + "resourceType": "Patient", + "name": [{"use": "official", "given": ["Sarah"], "family": "Johnson"}], + "gender": "female", + "birthDate": "1979-03-15", +} + + +def _resources(patient_id: str) -> list: + ref = f"Patient/{patient_id}" + return [ + { + "resourceType": "Condition", + "subject": {"reference": ref}, + "code": { + "coding": [ + { + "system": "http://hl7.org/fhir/sid/icd-10", + "code": "C53.9", + "display": "Malignant neoplasm of cervix uteri, unspecified", + } + ], + "text": "Cervical cancer", + }, + "clinicalStatus": { + "coding": [ + { + "system": "http://terminology.hl7.org/CodeSystem/condition-clinical", + "code": "active", + } + ] + }, + "onsetDateTime": "2025-11-01", + }, + { + "resourceType": "Appointment", + "status": "booked", + "description": "Colposcopy follow-up", + "start": "2026-04-10T10:00:00Z", + "end": "2026-04-10T10:30:00Z", + "participant": [ + { + "actor": {"reference": ref}, + "status": "accepted", + } + ], + }, + { + "resourceType": "CarePlan", + "status": "active", + "intent": "plan", + "subject": {"reference": ref}, + "title": "Cervical Cancer Treatment Plan", + "description": ( + "Stage IB cervical cancer. Treatment: radical hysterectomy followed by " + "adjuvant chemoradiation if margins involved. Monthly monitoring for 2 years. " + "Clinical nurse specialist available Mon-Fri 09:00-17:00, ext. 4821." 
+ ), + }, + ] + + +def main(): + base_url = _transaction_url(config.base_url) + with httpx.Client(timeout=30) as client: + # Step 1: create patient, get real ID + r = client.post(f"{base_url}/Patient", headers=_auth_headers(), json=PATIENT) + r.raise_for_status() + patient_id = r.json()["id"] + + # Step 2: create remaining resources referencing the real patient ID + for resource in _resources(patient_id): + r = client.post( + f"{base_url}/{resource['resourceType']}", + headers=_auth_headers(), + json=resource, + ) + r.raise_for_status() + + print("Seeded patient: Sarah Johnson (cervical cancer, stage IB)") + print(f"Patient ID: {patient_id}") + print(f"\nAdd to .env:\n DEMO_PATIENT_ID={patient_id}") + + +if __name__ == "__main__": + main() diff --git a/cookbook/fhir_context_llm_qa.py b/cookbook/fhir_context_llm_qa.py new file mode 100644 index 00000000..8641e603 --- /dev/null +++ b/cookbook/fhir_context_llm_qa.py @@ -0,0 +1,151 @@ +#!/usr/bin/env python3 +""" +FHIR-Grounded Patient Q&A + +Pulls patient data from a FHIR store, formats it as context, and serves +a Q&A endpoint powered by any LangChain-compatible LLM. + +Requirements: + pip install healthchain langchain-core langchain-anthropic python-dotenv + +Setup: + 1. Run: python cookbook/data/medplum_seed.py + 2. Add to .env: + MEDPLUM_CLIENT_ID=your_client_id + MEDPLUM_CLIENT_SECRET=your_client_secret + MEDPLUM_BASE_URL=https://api.medplum.com/fhir/R4 + MEDPLUM_TOKEN_URL=https://api.medplum.com/oauth2/token + ANTHROPIC_API_KEY=your_api_key # or OPENAI_API_KEY, etc. 
+ +Run: + python cookbook/fhir_context_llm_qa.py + # POST /qa {"patient_id": "...", "question": "..."} + # Docs at: http://localhost:8888/docs +""" + +from dotenv import load_dotenv +from pydantic import BaseModel + +from langchain_core.language_models import BaseChatModel +from langchain_core.output_parsers import StrOutputParser +from langchain_core.prompts import ChatPromptTemplate +from langchain_anthropic import ChatAnthropic + +from healthchain.fhir.r4b import Condition, Appointment, CarePlan + +from healthchain.gateway import FHIRGateway, HealthChainAPI +from healthchain.gateway.clients import FHIRAuthConfig +from healthchain.pipeline import Pipeline +from healthchain.io.containers import Document +from healthchain.fhir import merge_bundles + + +load_dotenv() + + +class PatientQuestion(BaseModel): + patient_id: str + question: str + + +class PatientAnswer(BaseModel): + patient_id: str + question: str + answer: str + + +def create_pipeline() -> Pipeline[Document]: + """Format a FHIR patient bundle into a structured LLM context string.""" + pipeline = Pipeline[Document]() + + @pipeline.add_node + def format_context(doc: Document) -> Document: + conditions = doc.fhir.get_resources("Condition") + appointments = doc.fhir.get_resources("Appointment") + careplans = doc.fhir.get_resources("CarePlan") + + lines = ["PATIENT CLINICAL CONTEXT"] + if conditions: + lines.append("\nDiagnoses:") + for c in conditions: + onset = c.onsetDateTime + lines.append( + f" - {c.code.text}" + (f" (since {onset})" if onset else "") + ) + if appointments: + lines.append("\nUpcoming Appointments:") + for a in appointments: + lines.append(f" - {a.description}: {a.start}") + if careplans: + lines.append("\nCare Plan:") + for cp in careplans: + lines.append(f" {cp.description}") + + doc.text = "\n".join(lines) + return doc + + return pipeline + + +def create_chain(llm: BaseChatModel): + """Q&A chain: patient context + question → grounded answer.""" + prompt = 
ChatPromptTemplate.from_messages( + [ + ( + "system", + "You are a patient information assistant at a hospital. " + "Use the patient's clinical context to give accurate, personalised responses. " + "Do not provide medical advice or diagnoses. " + "Refer clinical questions to the care team.", + ), + ("human", "{context}\n\nPatient question: {question}"), + ] + ) + return prompt | llm | StrOutputParser() + + +def create_app(llm: BaseChatModel) -> HealthChainAPI: + fhir_config = FHIRAuthConfig.from_env("MEDPLUM") + + gateway = FHIRGateway() + gateway.add_source("medplum", fhir_config.to_connection_string()) + + pipeline = create_pipeline() + chain = create_chain(llm) + + app = HealthChainAPI( + title="FHIR-Grounded Patient Q&A", + description="Answers patient questions using live FHIR data as context", + service_type="fhir-gateway", + ) + + @app.post("/qa") + def answer_question(request: PatientQuestion) -> PatientAnswer: + bundles = [] + for resource_type in [Condition, Appointment, CarePlan]: + try: + bundle = gateway.search( + resource_type, {"patient": request.patient_id}, "medplum" + ) + bundles.append(bundle) + except Exception as e: + print(f"Warning: Could not fetch {resource_type.__name__}: {e}") + + doc = Document(data=merge_bundles(bundles)) + + doc = pipeline(doc) + + answer = chain.invoke({"context": doc.text, "question": request.question}) + return PatientAnswer( + patient_id=request.patient_id, + question=request.question, + answer=answer, + ) + + return app + + +if __name__ == "__main__": + llm = ChatAnthropic(model="claude-opus-4-6", max_tokens=512) + app = create_app(llm) + app.run(port=8888) diff --git a/cookbook/multi_ehr_data_aggregation.py b/cookbook/multi_ehr_data_aggregation.py index 5068b7bb..98d6f47e 100644 --- a/cookbook/multi_ehr_data_aggregation.py +++ b/cookbook/multi_ehr_data_aggregation.py @@ -103,7 +103,6 @@ def get_unified_patient(patient_id: str, sources: List[str]) -> Bundle: app = HealthChainAPI( title="Multi-EHR Data Aggregation", 
description="Aggregate patient data from multiple FHIR sources", - port=8888, service_type="fhir-gateway", ) app.register_gateway(gateway, path="/fhir") @@ -112,4 +111,4 @@ def get_unified_patient(patient_id: str, sources: List[str]) -> Bundle: if __name__ == "__main__": - create_app().run() + create_app().run(port=8888) diff --git a/cookbook/notereader_clinical_coding_fhir.py b/cookbook/notereader_clinical_coding_fhir.py index 0c0c5595..c01e1395 100644 --- a/cookbook/notereader_clinical_coding_fhir.py +++ b/cookbook/notereader_clinical_coding_fhir.py @@ -105,7 +105,6 @@ def ai_coding_workflow(request: CdaRequest): app = HealthChainAPI( title="Epic CDI Service", description="Clinical document intelligence with FHIR and NoteReader integration", - port=8000, service_type="fhir-gateway", ) app.register_gateway(fhir_gateway, path="/fhir") diff --git a/cookbook/sepsis_cds_hooks.py b/cookbook/sepsis_cds_hooks.py index cf1ea7a2..55c1152e 100644 --- a/cookbook/sepsis_cds_hooks.py +++ b/cookbook/sepsis_cds_hooks.py @@ -80,9 +80,6 @@ def sepsis_alert(request: CDSRequest) -> CDSResponse: dataset = Dataset.from_fhir_bundle(bundle, schema=SCHEMA_PATH) result = pipeline(dataset) - # print("Result:") - # print(result.data.head(10)) - probability = float(result.metadata["probabilities"][0]) risk = ( "high" if probability > 0.7 else "moderate" if probability > 0.4 else "low" @@ -118,7 +115,6 @@ def sepsis_alert(request: CDSRequest) -> CDSResponse: app = HealthChainAPI( title="Sepsis CDS Hooks", description="Real-time sepsis risk alerts via CDS Hooks", - port=8000, service_type="cds-hooks", ) app.register_service(cds, path="/cds") diff --git a/docs/cookbook/fhir_qa.md b/docs/cookbook/fhir_qa.md new file mode 100644 index 00000000..953e1269 --- /dev/null +++ b/docs/cookbook/fhir_qa.md @@ -0,0 +1,305 @@ +# FHIR-Grounded Patient Q&A + +This example shows you how to build a Q&A service that answers patient questions using their live clinical data as context. 
The service fetches FHIR resources from a connected EHR, formats them into a structured prompt context using a HealthChain pipeline, and passes both to an LLM to generate a grounded, personalised response. + +This is the foundational pattern for patient-facing AI assistants — hospital portal chatbots, discharge navigation tools, care plan Q&A — where answers must be anchored to the individual patient's record rather than general medical knowledge. + +Check out the full working example [here](https://github.com/dotimplement/HealthChain/tree/main/cookbook/fhir_context_llm_qa.py)! + +## Setup + +```bash +pip install healthchain langchain-core langchain-anthropic python-dotenv + +# or for HuggingFace models +pip install healthchain langchain-core langchain-huggingface python-dotenv +``` + +We'll use [Medplum](https://www.medplum.com/) as our FHIR sandbox — it lets you seed your own synthetic patients and query them over a standard FHIR R4 API. If you haven't set up Medplum access yet, see the [FHIR Sandbox Setup Guide](./setup_fhir_sandboxes.md#medplum) for step-by-step instructions. + +Once you have your Medplum credentials, add them to a `.env` file: + +```bash +# .env file +MEDPLUM_CLIENT_ID=your_client_id +MEDPLUM_CLIENT_SECRET=your_client_secret +MEDPLUM_BASE_URL=https://api.medplum.com/fhir/R4 +MEDPLUM_TOKEN_URL=https://api.medplum.com/oauth2/token +ANTHROPIC_API_KEY=your_api_key # or OPENAI_API_KEY, etc. +``` + +### Seed test data + +The cookbook ships with a seed script that creates a synthetic patient with conditions, an upcoming appointment, and an active care plan: + +```bash +uv run python cookbook/data/medplum_seed.py +``` + +The script prints the new patient ID — add it to `.env` so you can reference it when testing: + +```bash +DEMO_PATIENT_ID= +``` + +## Format FHIR data as LLM context + +The first piece is a HealthChain [Pipeline](../reference/pipeline/pipeline.md) that transforms a FHIR Bundle into a structured plain-text context block. 
This is a deliberate design choice: the LLM never sees raw FHIR JSON. Instead, you control exactly what clinical information is surfaced and how it's phrased. + +```python +from healthchain.pipeline import Pipeline +from healthchain.io.containers import Document + +def create_pipeline() -> Pipeline[Document]: + pipeline = Pipeline[Document]() + + @pipeline.add_node + def format_context(doc: Document) -> Document: + conditions = doc.fhir.get_resources("Condition") + appointments = doc.fhir.get_resources("Appointment") + careplans = doc.fhir.get_resources("CarePlan") + + lines = ["PATIENT CLINICAL CONTEXT"] + if conditions: + lines.append("\nDiagnoses:") + for c in conditions: + onset = c.onsetDateTime + lines.append( + f" - {c.code.text}" + (f" (since {onset})" if onset else "") + ) + if appointments: + lines.append("\nUpcoming Appointments:") + for a in appointments: + lines.append(f" - {a.description}: {a.start}") + if careplans: + lines.append("\nCare Plan:") + for cp in careplans: + lines.append(f" {cp.description}") + + doc.text = "\n".join(lines) + return doc + + return pipeline +``` + +When you initialize a [Document](../reference/io/containers/document.md) with a FHIR Bundle, it automatically extracts resources by type so you can query them directly: + +```python +doc = Document(data=bundle) + +doc.fhir.get_resources("Condition") # List[Condition] +doc.fhir.get_resources("Appointment") # List[Appointment] +doc.fhir.get_resources("CarePlan") # List[CarePlan] +``` + +After the pipeline runs, `doc.text` holds the formatted context string ready to inject into the LLM prompt. + +!!! tip "Customising context" + + What you include here directly shapes response quality. 
Common additions: + + - **Medications** — `doc.fhir.get_resources("MedicationRequest")` + - **Recent results** — `doc.fhir.get_resources("Observation")` + - **Discharge letters** — `doc.fhir.get_resources("DocumentReference")` + + For sensitive resources (mental health, substance use), apply consent-based filtering before adding them to context. + +## Build the Q&A chain + +The second piece is a LangChain chain that takes the formatted context and the patient's question and returns a grounded answer. The system prompt sets the scope: answer from the patient's record, don't provide medical diagnoses, refer clinical questions to the care team. + +```python +from langchain_core.language_models import BaseChatModel +from langchain_core.output_parsers import StrOutputParser +from langchain_core.prompts import ChatPromptTemplate + +def create_chain(llm: BaseChatModel): + prompt = ChatPromptTemplate.from_messages( + [ + ( + "system", + "You are a patient information assistant at a hospital. " + "Use the patient's clinical context to give accurate, personalised responses. " + "Do not provide medical advice or diagnoses. " + "Refer clinical questions to the care team.", + ), + ("human", "{context}\n\nPatient question: {question}"), + ] + ) + return prompt | llm | StrOutputParser() +``` + +Any LangChain-compatible LLM works here — swap `ChatAnthropic` for a HuggingFace model or any other provider without changing the pipeline or gateway: + +```python +from langchain_huggingface.llms import HuggingFaceEndpoint +from langchain_huggingface import ChatHuggingFace + +hf = HuggingFaceEndpoint( + repo_id="mistralai/Mistral-7B-Instruct-v0.3", + task="text-generation", + max_new_tokens=512, +) +llm = ChatHuggingFace(llm=hf) +app = create_app(llm) +``` + +Set `HUGGINGFACEHUB_API_TOKEN` in your `.env` file to authenticate. + +!!! 
note "HealthChain complements your existing stack" + + HealthChain handles the healthcare-specific plumbing: FHIR authentication, resource fetching, context formatting, and deployment scaffolding. Your LangChain chains, prompts, and LLM choices stay exactly as they are. If you're already using FastAPI, `HealthChainAPI` is a thin wrapper that adds FHIR-aware routing and auto-generated OpenAPI docs on top — you're not replacing anything. + +## Build the service + +Wire the gateway, pipeline, and chain together into a [HealthChainAPI](../reference/gateway/api.md) service with a single `/qa` endpoint: + +```python +from pydantic import BaseModel +from healthchain.fhir.r4b import Condition, Appointment, CarePlan +from healthchain.gateway import FHIRGateway, HealthChainAPI +from healthchain.gateway.clients import FHIRAuthConfig +from healthchain.fhir import merge_bundles + +class PatientQuestion(BaseModel): + patient_id: str + question: str + +class PatientAnswer(BaseModel): + patient_id: str + question: str + answer: str + +def create_app(llm: BaseChatModel) -> HealthChainAPI: + fhir_config = FHIRAuthConfig.from_env("MEDPLUM") + gateway = FHIRGateway() + gateway.add_source("medplum", fhir_config.to_connection_string()) + + pipeline = create_pipeline() + chain = create_chain(llm) + + app = HealthChainAPI( + title="FHIR-Grounded Patient Q&A", + description="Answers patient questions using live FHIR data as context", + service_type="fhir-gateway", + ) + + @app.post("/qa") + def answer_question(request: PatientQuestion) -> PatientAnswer: + bundles = [] + for resource_type in [Condition, Appointment, CarePlan]: + try: + bundle = gateway.search( + resource_type, {"patient": request.patient_id}, "medplum" + ) + bundles.append(bundle) + except Exception as e: + print(f"Warning: Could not fetch {resource_type.__name__}: {e}") + + doc = Document(data=merge_bundles(bundles)) + doc = pipeline(doc) + + answer = chain.invoke({"context": doc.text, "question": request.question}) + 
return PatientAnswer( + patient_id=request.patient_id, + question=request.question, + answer=answer, + ) + + return app +``` + +Then run it: + +```python +from langchain_anthropic import ChatAnthropic + +if __name__ == "__main__": + llm = ChatAnthropic(model="claude-opus-4-6", max_tokens=512) + app = create_app(llm) + app.run(port=8888) +``` + +!!! info "How the endpoint works" + + For each `/qa` request, the service: + + 1. Fetches Conditions, Appointments, and CarePlans for the patient from Medplum + 2. Merges them into a single Bundle with `merge_bundles()` + 3. Runs the pipeline to produce a plain-text context string + 4. Calls the LLM chain with the context + question + 5. Returns the answer as a `PatientAnswer` JSON response + +## Test the service + +With the service running at `http://localhost:8888`, use your seeded patient ID from `.env`: + +=== "cURL" + ```bash + curl -X POST http://localhost:8888/qa \ + -H "Content-Type: application/json" \ + -d '{"patient_id": "", "question": "When is my next appointment?"}' + ``` + +=== "Python" + ```python + import requests + + response = requests.post( + "http://localhost:8888/qa", + json={ + "patient_id": "", + "question": "When is my next appointment?", + }, + ) + print(response.json()) + ``` + +Interactive API docs are available at `http://localhost:8888/docs`. + +??? example "Illustrative response" + + ```json + { + "patient_id": "abc123", + "question": "When is my next appointment?", + "answer": "Your next appointment is a Colposcopy follow-up scheduled for 10 April 2026 at 10:00 AM. If you need to reschedule or have questions about what to expect, please contact your care team directly." + } + ``` + + *Output will vary based on your seeded patient data and LLM model.* + +??? 
warning "Missing resources" + + If a resource type isn't available for a patient, the service logs a warning and continues — partial context is better than an error: + + ``` + Warning: Could not fetch CarePlan: [FHIR request failed: 404] + ``` + + The LLM will answer based on whatever resources were successfully retrieved. + +## What You've Built + +A FHIR-grounded patient Q&A service that: + +- **Fetches live FHIR data** — connects to any FHIR R4 server via the gateway; swap Medplum for Epic, Cerner, or an NHS API by changing the source config +- **Formats context deterministically** — the pipeline controls exactly what the LLM sees; no raw FHIR JSON in prompts +- **Is LLM-agnostic** — any LangChain-compatible model works without changing the pipeline or gateway +- **Handles partial data gracefully** — individual resource failures don't crash the service +- **Exposes a standard REST endpoint** — auto-documented at `/docs`, ready to call from a frontend or other service + +!!! info "Use Cases" + + - **Patient portal chatbots** — answer "what medications am I on?", "when is my next scan?", "what does my care plan say?" directly from the patient's record + - **Discharge navigation** — help patients understand their discharge instructions, follow-up appointments, and care plan actions in plain language + - **Clinical inbox triage** — pre-generate context-aware responses to common patient messages, reducing administrative burden on care teams + - **Care plan explanation** — surface care plan steps in patient-friendly language, personalised to their conditions and appointments + +!!! 
tip "Next Steps" + + - **Add more resource types**: Extend the pipeline to include `MedicationRequest`, `Observation`, or `DocumentReference` for richer context + - **Swap the LLM**: Replace `ChatAnthropic` with a HuggingFace model (`ChatHuggingFace` + `HuggingFaceEndpoint`) or any other LangChain-compatible provider — the pipeline and gateway are unchanged + - **Connect to a real FHIR source**: Replace Medplum with an Epic or Cerner sandbox — see [Setup FHIR Sandboxes](./setup_fhir_sandboxes.md) for instructions + - **Add conversation history**: Extend `PatientQuestion` with a `history` field and pass it into the LangChain prompt for multi-turn Q&A + - **Go to production**: Scaffold a project with `healthchain new` and run with `healthchain serve` — see [From cookbook to service](./index.md#from-cookbook-to-service). Moving to `healthchain.yaml` is where config-driven compliance support (audit logging, certificates, deployment metadata) will live as those features mature. diff --git a/docs/cookbook/index.md b/docs/cookbook/index.md index da7eb9cb..e0bf3ac0 100644 --- a/docs/cookbook/index.md +++ b/docs/cookbook/index.md @@ -44,6 +44,20 @@ Hands-on, production-ready examples for building healthcare AI applications with + +
💬
+
FHIR-Grounded Patient Q&A
+
+ Build a patient Q&A service that fetches live FHIR data, formats it as LLM context via a pipeline, and returns grounded answers. A foundational pattern for patient portal chatbots and care navigation assistants.
+
+ GenAI + FHIR + Pipeline + Gateway +
+
+
🔗
Multi-Source Patient Data Aggregation
@@ -132,7 +146,8 @@ gateway.add_source("medplum", FHIRAuthConfig.from_env("MEDPLUM").to_connection_s llm = ChatAnthropic(model="claude-opus-4-6", max_tokens=512) -app = HealthChainAPI(title="My App", port=8000, service_type="fhir-gateway") +app = HealthChainAPI(title="My App", service_type="fhir-gateway") +app.run(port=8000) ``` ```yaml diff --git a/docs/cookbook/ml_model_deployment.md b/docs/cookbook/ml_model_deployment.md index 91f7efdd..a68b78d3 100644 --- a/docs/cookbook/ml_model_deployment.md +++ b/docs/cookbook/ml_model_deployment.md @@ -1,6 +1,6 @@ # Deploy ML Models: Real-Time Alerts & Batch Screening -You trained a model on CSVs. Now you need to deploy it against FHIR data from EHRs. This tutorial shows how to bridge that gap with two production patterns: **real-time CDS Hooks alerts** and **batch FHIR Gateway screening**—both using the same model and a simple YAML schema that maps FHIR resources to your training features. +You trained a model on CSVs. Now you need to deploy it against FHIR data from EHRs. This tutorial shows how to bridge that gap with two production patterns: **real-time CDS Hooks alerts** and **batch FHIR Gateway screening** — both using the same model and a simple YAML schema that maps FHIR resources to your training features. Check out the full working examples: @@ -16,173 +16,35 @@ Check out the full working examples: | **CDS Hooks** | Clinician opens chart | Alert cards in EHR UI | Point-of-care decision support | | **FHIR Gateway** | Scheduled job / API call | [RiskAssessment](https://www.hl7.org/fhir/riskassessment.html) resources | Population screening, quality measures | -Both patterns share the same trained model and feature extraction—only the integration layer differs. +Both patterns share the same trained model and feature extraction — only the integration layer differs. 
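That shared core is small: the model emits a probability, and each integration layer only decides how to present it. As a standalone sketch, the risk banding used by the sepsis cookbook (`> 0.7` high, `> 0.4` moderate, otherwise low) looks like this — `risk_band` is an illustrative helper name, not a HealthChain API:

```python
def risk_band(probability: float) -> str:
    """Map a model probability to a risk band.

    Thresholds mirror cookbook/sepsis_cds_hooks.py:
    > 0.7 -> high, > 0.4 -> moderate, otherwise low.
    """
    if probability > 0.7:
        return "high"
    if probability > 0.4:
        return "moderate"
    return "low"


# CDS Hooks renders the band as an alert card in the EHR UI; the FHIR
# Gateway writes it into a RiskAssessment resource -- same number, two outputs.
for p in (0.85, 0.52, 0.12):
    print(f"p={p:.2f} -> {risk_band(p)}")
```

Everything upstream of this step — schema-driven feature extraction, the pickled classifier, `predict_proba()` — is identical across both patterns.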
-## Setup - -### Install Dependencies - -```bash -pip install healthchain joblib xgboost scikit-learn python-dotenv -``` +--- -### Train the Model (or Bring Your Own) +## Quick Start: CDS Hooks in 5 Minutes -The cookbook includes a training script that builds an XGBoost classifier from MIMIC-IV data. From the project root: +The model and demo patients are already in the repo — no training or data download needed. ```bash -cd scripts -python sepsis_prediction_training.py +pip install healthchain joblib xgboost +python cookbook/sepsis_cds_hooks.py ``` -This script: - -- Loads MIMIC-IV CSV tables (chartevents, labevents, patients, diagnoses) -- Extracts vitals features (heart rate, temperature, respiratory rate, WBC, lactate, creatinine, age, gender) -- Labels ICU stays with sepsis diagnoses (ICD-9/ICD-10) -- Trains Random Forest, XGBoost, and Logistic Regression models -- Saves the best model (by F1 score) to `scripts/models/sepsis_model.pkl` +That's it. The script starts a local CDS Hooks service, fires test requests against it using three pre-extracted MIMIC patients, and prints risk scores: -After training, copy the model to the cookbook directory: - -```bash -cp scripts/models/sepsis_model.pkl cookbook/models/ ``` - -!!! note "MIMIC-IV Demo Dataset" - - The training script uses the [MIMIC-IV Clinical Database Demo](https://physionet.org/content/mimic-iv-demo/2.2/) (~50MB, freely downloadable). Set the path: - - ```bash - export MIMIC_CSV_PATH=/path/to/mimic-iv-clinical-database-demo-2.2 - ``` - - *This is a quick-start workflow for demo purposes. Full MIMIC requires credentialed access. Most researchers use BigQuery or a PostgreSQL database. 
- -**Using your own model?** The pipeline is flexible—just save any scikit-learn-compatible model as a pickle with this structure: - -```python -import joblib - -model_data = { - "model": your_trained_model, # Must have .predict_proba() - "metadata": { - "feature_names": ["heart_rate", "temperature", ...], - "metrics": {"optimal_threshold": 0.5} - } -} -joblib.dump(model_data, "cookbook/models/sepsis_model.pkl") +Processed 3 requests + Patient 1: Sepsis Risk: HIGH (85%) + Patient 2: Sepsis Risk: MODERATE (52%) + Patient 3: Low risk (no alert) ``` -The pipeline will work with any model that implements `predict_proba()` - XGBoost, Random Forest, LightGBM, or even PyTorch/TensorFlow models wrapped with a sklearn-compatible interface. - -### Prepare Demo Patient Data - -The two patterns have different data requirements: - -| Pattern | Data Source | What You Need | -|---------|-------------|---------------| -| **CDS Hooks** | Local JSON files | Download pre-extracted patients (quick start) | -| **FHIR Gateway** | FHIR server | Upload patients to Medplum and get server-assigned IDs | - -=== "CDS Hooks Only (Quick Start)" - - Download pre-extracted patient bundles—these are already in the repo if you cloned it: - - ```bash - mkdir -p cookbook/data/mimic_demo_patients - cd cookbook/data/mimic_demo_patients - wget https://github.com/dotimplement/HealthChain/raw/main/cookbook/data/mimic_demo_patients/high_risk_patient.json - wget https://github.com/dotimplement/HealthChain/raw/main/cookbook/data/mimic_demo_patients/moderate_risk_patient.json - wget https://github.com/dotimplement/HealthChain/raw/main/cookbook/data/mimic_demo_patients/low_risk_patient.json - ``` - - That's it! Skip to [Pattern 1: CDS Hooks](#pattern-1-real-time-cds-hooks-alerts). - -=== "FHIR Gateway (Full Setup)" - - The batch screening pattern queries patients from a FHIR server. 
This tutorial uses [Medplum](https://www.medplum.com/) (a free, hosted FHIR server), but any FHIR R4-compliant API works - just swap the credentials. - - **1. Configure FHIR Credentials** - - Add Medplum credentials to your `.env` file. See [FHIR Sandbox Setup](./setup_fhir_sandboxes.md#medplum) for details: - - ```bash - MEDPLUM_BASE_URL=https://api.medplum.com/fhir/R4 - MEDPLUM_CLIENT_ID=your_client_id - MEDPLUM_CLIENT_SECRET=your_client_secret - MEDPLUM_TOKEN_URL=https://api.medplum.com/oauth2/token - MEDPLUM_SCOPE=openid - ``` - - **2. Extract and Upload Demo Patients** - - ```bash - # Set MIMIC-on-FHIR path (or use --mimic flag) - export MIMIC_FHIR_PATH=/path/to/mimic-iv-on-fhir - - # Extract and upload to Medplum - cd scripts - python extract_mimic_demo_patients.py --minimal --upload - ``` - - This script: - - - Loads patient data from [MIMIC-IV on FHIR](https://physionet.org/content/mimic-iv-demo/2.2/) - - Runs the sepsis model to find high/moderate/low risk patients - - Creates minimal FHIR bundles with only the observations needed - - Uploads them to your Medplum instance as transaction bundles - - **3. Copy Patient IDs** - - After upload, the script prints server-assigned patient IDs: - - ``` - ✓ Uploaded to Medplum! - - Copy this into sepsis_fhir_batch.py: - - DEMO_PATIENT_IDS = [ - "702e11e8-6d21-41dd-9b48-31715fdc0fb1", # high risk - "3b0da7e9-0379-455a-8d35-bedd3a6ee459", # moderate risk - "f490ceb4-6262-4f1e-8b72-5515e6c46741", # low risk - ] - ``` - - Copy these IDs into the `DEMO_PATIENT_IDS` list in `sepsis_fhir_batch.py`. - - !!! tip "Generate More Patients" - - The script has options for generating larger test sets: - - ```bash - python extract_mimic_demo_patients.py --help - - # Examples: - --num-patients-per-risk 5 # 5 patients per risk level (15 total) - --seed 123 # Different random sample - --minimal # Keep only latest observation per feature (~12KB each) - ``` - - !!! 
tip "Alternative: Manual Upload" - - If you prefer, run without `--upload` to generate bundle JSON files, then upload them manually via the [Medplum → Batch](https://app.medplum.com/batch) page. +Results are saved to `./output/`. The rest of this tutorial explains how it works and how to adapt it. --- -**Setup complete!** You should now have: - -- ✅ A trained model at `cookbook/models/sepsis_model.pkl` -- ✅ Demo patient data (local JSON or uploaded to Medplum) - -If using the **FHIR Gateway pattern**, also confirm: - -- ✅ FHIR credentials in `.env` -- ✅ Patient IDs copied into `DEMO_PATIENT_IDS` in `sepsis_fhir_batch.py` - ## The Shared Model Pipeline -Both patterns reuse the same pipeline. Here's what you'll write: +Both patterns reuse the same pipeline. It loads a pre-trained XGBoost classifier and runs inference on a `Dataset` extracted from a FHIR Bundle: ```python def create_pipeline() -> Pipeline[Dataset]: @@ -203,13 +65,7 @@ def create_pipeline() -> Pipeline[Dataset]: return pipeline ``` -The pipeline operates on a `Dataset`, which you create from a FHIR bundle: - -```python -dataset = Dataset.from_fhir_bundle(bundle, schema=SCHEMA_PATH) -``` - -**How does FHIR become a DataFrame?** The schema maps FHIR resources to your training features: +**How does FHIR become a DataFrame?** A YAML schema maps FHIR resources to your training features: ```yaml # sepsis_vitals.yaml (excerpt) @@ -226,21 +82,62 @@ features: transform: calculate_age ``` -No FHIR parsing code needed—define the mapping once, use it everywhere. +No FHIR parsing code needed — define the mapping once, use it everywhere: + +```python +dataset = Dataset.from_fhir_bundle(bundle, schema=SCHEMA_PATH) +``` !!! tip "Explore Interactively" Step through the full flow in [notebooks/fhir_ml_workflow.ipynb](https://github.com/dotimplement/HealthChain/blob/main/notebooks/fhir_ml_workflow.ipynb): FHIR bundle → Dataset → DataFrame → inference → RiskAssessment. 
-Now let's see how this pipeline plugs into each deployment pattern. +??? details "Train your own model" + + The cookbook includes a training script that builds an XGBoost classifier from MIMIC-IV data: + + ```bash + cd scripts + python sepsis_prediction_training.py + ``` + + This script loads MIMIC-IV CSV tables, extracts vitals features, labels ICU stays with sepsis diagnoses, trains and evaluates several classifiers, and saves the best model to `scripts/models/sepsis_model.pkl`. + + After training, copy it into the cookbook directory: + + ```bash + cp scripts/models/sepsis_model.pkl cookbook/models/ + ``` + + !!! note "MIMIC-IV Demo Dataset" + + The training script uses the [MIMIC-IV Clinical Database Demo](https://physionet.org/content/mimic-iv-demo/2.2/) (~50MB, freely downloadable): + + ```bash + export MIMIC_CSV_PATH=/path/to/mimic-iv-clinical-database-demo-2.2 + ``` + + **Using your own model?** Save any scikit-learn-compatible model as a pickle with this structure: + + ```python + import joblib + + joblib.dump({ + "model": your_trained_model, # Must have .predict_proba() + "metadata": { + "feature_names": ["heart_rate", "temperature", ...], + "metrics": {"optimal_threshold": 0.5} + } + }, "cookbook/models/sepsis_model.pkl") + ``` + + The pipeline works with any model that implements `predict_proba()` — XGBoost, Random Forest, LightGBM, or PyTorch/TensorFlow wrapped with a sklearn-compatible interface. --- ## Pattern 1: Real-Time CDS Hooks Alerts -Use CDS Hooks when you need **instant alerts** during clinical workflows. The EHR triggers your service and pushes patient data via prefetch—no server queries needed. - -### How It Works +Use CDS Hooks when you need **instant alerts** during clinical workflows. The EHR triggers your service and pushes patient data via prefetch — no server queries needed. 
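The decision the service makes is deliberately small: map a model probability to a tiered alert, or no alert at all. A standalone sketch of that logic (the thresholds mirror the handler later in this section; the dict follows the CDS Hooks card shape, with illustrative values):

```python
def make_card(prob: float):
    """Map a model probability to a CDS Hooks-style card dict, or None
    for low risk (the service then returns an empty card list).
    Thresholds mirror the tutorial's handler: >0.7 high, >0.4 moderate."""
    if prob <= 0.4:
        return None
    risk = "high" if prob > 0.7 else "moderate"
    return {
        "summary": f"Sepsis Risk: {risk.upper()} ({prob:.0%})",
        "indicator": "critical" if risk == "high" else "warning",
        "source": {"label": "sepsis_xgboost_v1"},  # model name illustrative
    }

print(make_card(0.85)["summary"])  # Sepsis Risk: HIGH (85%)
print(make_card(0.52)["summary"])  # Sepsis Risk: MODERATE (52%)
print(make_card(0.15))             # None
```

Returning `None` for low-risk patients matters: silent no-ops keep alert fatigue down, which is why the quick-start output shows no alert for the third patient.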
``` Clinician opens chart → EHR fires patient-view hook → Your service runs prediction → CDS card appears in EHR @@ -268,7 +165,6 @@ def sepsis_alert(request: CDSRequest) -> CDSResponse: dataset = Dataset.from_fhir_bundle(bundle, schema=SCHEMA_PATH) result = pipeline(dataset) - # Generate alert card if risk is elevated prob = float(result.metadata["probabilities"][0]) risk = "high" if prob > 0.7 else "moderate" if prob > 0.4 else "low" @@ -285,40 +181,27 @@ def sepsis_alert(request: CDSRequest) -> CDSResponse: return CDSResponse(cards=[]) ``` -### Build the Service +### Build and Test the Service -Register with [HealthChainAPI](../reference/gateway/api.md): +Register with [HealthChainAPI](../reference/gateway/api.md) and test using the [SandboxClient](../reference/utilities/sandbox.md): ```python +from healthchain.gateway import HealthChainAPI +from healthchain.sandbox import SandboxClient + app = HealthChainAPI(title="Sepsis CDS Hooks") app.register_service(cds, path="/cds") -``` - -### Test with Sandbox Client - -The [SandboxClient](../reference/utilities/sandbox.md) simulates EHR requests using your demo patient files: - -```python -from healthchain.sandbox import SandboxClient +# Test with pre-extracted demo patients client = SandboxClient( url="http://localhost:8000/cds/cds-services/sepsis-risk", workflow="patient-view", ) -client.load_from_path("data/mimic_demo_patients", pattern="*_patient.json") +client.load_from_path("cookbook/data/mimic_demo_patients", pattern="*_patient.json") responses = client.send_requests() client.save_results(save_request=True, save_response=True, directory="./output/") ``` -### Expected Output - -``` -Processed 3 requests - Patient 1: Sepsis Risk: HIGH (85%) - Patient 2: Sepsis Risk: MODERATE (52%) - Patient 3: Low risk (no alert) -``` - ??? 
example "Example CDS Response" ```json @@ -340,100 +223,104 @@ Processed 3 requests --- -## Pattern 2: Batch FHIR Gateway Screening +## Advanced: Batch FHIR Gateway Screening Use the FHIR Gateway when you need to **screen multiple patients** from a FHIR server. Unlike CDS Hooks (ephemeral alerts), this pattern **persists predictions back to the FHIR server** as RiskAssessment resources, making them available for dashboards, reports, and downstream workflows. -### How It Works - ``` Query patients from FHIR server → Run predictions → Write RiskAssessment back to FHIR server ``` -### Set Up FHIR Gateway +**Prerequisites:** A running FHIR server with patient data. This tutorial uses [Medplum](https://www.medplum.com/) — see the [FHIR Sandbox Setup guide](./setup_fhir_sandboxes.md#medplum) to get credentials, then add them to `.env`: + +```bash +MEDPLUM_BASE_URL=https://api.medplum.com/fhir/R4 +MEDPLUM_CLIENT_ID=your_client_id +MEDPLUM_CLIENT_SECRET=your_client_secret +MEDPLUM_TOKEN_URL=https://api.medplum.com/oauth2/token +``` + +??? details "Upload demo patients to Medplum" -Configure the [FHIRGateway](../reference/gateway/fhir_gateway.md) with your FHIR source: + Extract MIMIC patients and upload them to your Medplum instance: + + ```bash + export MIMIC_FHIR_PATH=/path/to/mimic-iv-on-fhir + cd scripts + python extract_mimic_demo_patients.py --minimal --upload + ``` + + The script prints server-assigned patient IDs — copy them into `DEMO_PATIENT_IDS` in `sepsis_fhir_batch.py`: + + ``` + ✓ Uploaded to Medplum! + + Copy this into sepsis_fhir_batch.py: + + DEMO_PATIENT_IDS = [ + "702e11e8-6d21-41dd-9b48-31715fdc0fb1", # high risk + "3b0da7e9-0379-455a-8d35-bedd3a6ee459", # moderate risk + "f490ceb4-6262-4f1e-8b72-5515e6c46741", # low risk + ] + ``` + + Options for larger test sets: `--num-patients-per-risk 5`, `--seed 123`, `--help`. 
+ +### Screen Patients and Write Back Results + +Configure the [FHIRGateway](../reference/gateway/fhir_gateway.md), run predictions, and write [RiskAssessment](https://www.hl7.org/fhir/riskassessment.html) resources back to the server: ```python -from healthchain.fhir.r4b import Patient, Observation from healthchain.gateway import FHIRGateway from healthchain.gateway.clients.fhir.base import FHIRAuthConfig +from healthchain.fhir.r4b import Patient, Observation from healthchain.fhir import merge_bundles gateway = FHIRGateway() -config = FHIRAuthConfig.from_env("MEDPLUM") -gateway.add_source("medplum", config.to_connection_string()) -``` +gateway.add_source("medplum", FHIRAuthConfig.from_env("MEDPLUM").to_connection_string()) -### Screen Individual Patients - -Query patient data, run prediction, and write back a [RiskAssessment](https://www.hl7.org/fhir/riskassessment.html) resource: - -```python -def screen_patient(gateway: FHIRGateway, patient_id: str, source: str): - # Query patient + observations from FHIR server - patient_bundle = gateway.search(Patient, {"_id": patient_id}, source) - obs_bundle = gateway.search(Observation, {"patient": patient_id}, source) +def screen_patient(patient_id: str): + patient_bundle = gateway.search(Patient, {"_id": patient_id}, "medplum") + obs_bundle = gateway.search(Observation, {"patient": patient_id}, "medplum") bundle = merge_bundles([patient_bundle, obs_bundle]) - # FHIR → Dataset → Prediction dataset = Dataset.from_fhir_bundle(bundle, schema=SCHEMA_PATH) result = pipeline(dataset) - # Convert to RiskAssessment and write back for ra in result.to_risk_assessment( outcome_code="A41.9", outcome_display="Sepsis", model_name="sepsis_xgboost_v1", ): - gateway.create(ra, source=source) -``` - -### Batch Screen Multiple Patients - -Loop over patient IDs and screen each one: + gateway.create(ra, source="medplum") -```python -for patient_id in patient_ids: - screen_patient(gateway, patient_id, source="medplum") +for patient_id in 
DEMO_PATIENT_IDS: + screen_patient(patient_id) ``` !!! note "Demo vs Production" - This demo uses a fixed list of patient IDs. In production, you'd query for patients dynamically—for example, ICU admissions in the last hour: + This demo uses a fixed list of patient IDs. In production, query for patients dynamically — for example, ICU admissions in the last hour: ```python - # Find patients with recent ICU encounters encounters = gateway.search( Encounter, - { - "class": "IMP", # inpatient - "location": "icu", - "date": "ge2024-01-01", - }, + {"class": "IMP", "location": "icu", "date": "ge2024-01-01"}, source="ehr" ) patient_ids = [e.subject.reference.split("/")[1] for e in encounters] ``` -### Build the Service - -```python -app = HealthChainAPI(title="Sepsis Batch Screening") -app.register_gateway(gateway, path="/fhir") -``` - ### Expected Output -After uploading demo patients to Medplum and running batch screening: - ``` === Screening patients from Medplum === - 702e11e8-6d21-41dd-9b48-31715fdc0fb1: HIGH (85%) → RiskAssessment/abc123 - 3b0da7e9-0379-455a-8d35-bedd3a6ee459: MODERATE (52%) → RiskAssessment/def456 - f490ceb4-6262-4f1e-8b72-5515e6c46741: LOW (15%) → RiskAssessment/ghi789 + 702e11e8-...: HIGH (85%) → RiskAssessment/abc123 + 3b0da7e9-...: MODERATE (52%) → RiskAssessment/def456 + f490ceb4-...: LOW (15%) → RiskAssessment/ghi789 ``` -You should be able to see the RiskAssessment resources in the [Medplum console](https://app.medplum.com) (search for "RiskAssessment" in "Resource Type" search bar in top left corner) + +RiskAssessment resources are visible in the [Medplum console](https://app.medplum.com) — search "RiskAssessment" in the resource type search bar. ??? 
example "Example RiskAssessment Resource" @@ -442,9 +329,7 @@ You should be able to see the RiskAssessment resources in the [Medplum console]( "resourceType": "RiskAssessment", "id": "abc123", "status": "final", - "subject": { - "reference": "Patient/702e11e8-6d21-41dd-9b48-31715fdc0fb1" - }, + "subject": { "reference": "Patient/702e11e8-6d21-41dd-9b48-31715fdc0fb1" }, "method": { "coding": [{ "system": "https://healthchain.io/models", @@ -487,9 +372,9 @@ Two deployment patterns for the same ML model: Both patterns: -- **Share the same model** - Train once, deploy multiple ways -- **Use YAML feature schemas** - Declarative FHIR → features mapping -- **Handle FHIR natively** - No custom data wrangling per integration +- **Share the same model** — train once, deploy multiple ways +- **Use YAML feature schemas** — declarative FHIR → features mapping, no custom parsing +- **Handle FHIR natively** — no custom data wrangling per integration !!! info "Use Cases" @@ -508,9 +393,8 @@ Both patterns: !!! 
tip "Next Steps"
 
-    - **Train your own model**: Replace `sepsis_model.pkl` with your model; update the feature schema to match
-    - **Add more features**: Extend `sepsis_vitals.yaml` with lab values, medications, or other Observations
-    - **Add more FHIR sources**: The gateway supports multiple sources—see the cookbook script for Epic sandbox configuration, or the [FHIR Sandbox Setup guide](./setup_fhir_sandboxes.md)
-    - **Automate batch runs**: Schedule screening jobs with cron, Airflow, or cloud schedulers; or use [FHIR Subscriptions](https://www.hl7.org/fhir/subscription.html) to trigger on new ICU admissions ([PRs welcome!](https://github.com/dotimplement/HealthChain/pulls))
+    - **Train your own model**: Replace `sepsis_model.pkl` with your model; update the feature schema to match your features
+    - **Add more FHIR sources**: The gateway supports multiple sources — see the [FHIR Sandbox Setup guide](./setup_fhir_sandboxes.md)
     - **Combine patterns**: Use batch screening to identify high-risk patients, then enable CDS alerts for those patients
-    - **Go to production**: Scaffold a project with `healthchain new` and run with `healthchain serve` — see [From cookbook to service](./index.md#from-cookbook-to-service).
+    - **Automate batch runs**: Schedule screening jobs with cron, Airflow, or cloud schedulers; or use [FHIR Subscriptions](https://www.hl7.org/fhir/subscription.html) to trigger on new ICU admissions ([PRs welcome!](https://github.com/dotimplement/HealthChain/pulls))
+    - **Go to production**: Scaffold a project with `healthchain new` and run with `healthchain serve` — see [From cookbook to service](./index.md#from-cookbook-to-service). Moving to `healthchain.yaml` also positions you for config-driven compliance support (audit logging, model versioning, deployment metadata) as those features mature.
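For readers wiring up their own write-back, the resource emitted by `to_risk_assessment()` can be sketched as a plain dict. This is a hypothetical minimal shape mirroring the example RiskAssessment earlier in this page, not HealthChain's actual serialization; the outcome code and method system are illustrative:

```python
def to_risk_assessment(patient_id: str, prob: float, model_name: str) -> dict:
    """Minimal FHIR R4 RiskAssessment as a plain dict, roughly the shape
    the batch pattern writes back to the server. Sketch only: the ICD-10
    outcome code and the method system URI are illustrative."""
    return {
        "resourceType": "RiskAssessment",
        "status": "final",
        "subject": {"reference": f"Patient/{patient_id}"},
        "method": {"coding": [{
            "system": "https://healthchain.io/models",
            "code": model_name,
        }]},
        "prediction": [{
            "outcome": {"coding": [{
                "system": "http://hl7.org/fhir/sid/icd-10",
                "code": "A41.9",
                "display": "Sepsis",
            }]},
            "probabilityDecimal": round(prob, 2),
        }],
    }

ra = to_risk_assessment("702e11e8", 0.85, "sepsis_xgboost_v1")
print(ra["subject"]["reference"])  # Patient/702e11e8
```

Recording the model name under `method` is what lets dashboards and audits distinguish predictions from different model versions later.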
diff --git a/docs/cookbook/multi_ehr_aggregation.md b/docs/cookbook/multi_ehr_aggregation.md
index 0603b08c..3b25bbc4 100644
--- a/docs/cookbook/multi_ehr_aggregation.md
+++ b/docs/cookbook/multi_ehr_aggregation.md
@@ -403,10 +403,14 @@ A production-ready data aggregation service with:
 
 - **Training Data for AI Models**: Aggregate diverse patient data across EHR vendors for model training. Provenance tags enable stratified analysis (e.g., "how does model performance vary by data source?").
 
+!!! note "HealthChain complements your existing stack"
+
+    HealthChain handles FHIR authentication, multi-source querying, provenance tracking, and deduplication. Your downstream tools — LangChain, pandas, custom NLP — stay exactly as they are. The aggregated Bundle or `Document` drops straight into whatever processing you already have.
+
 !!! tip "Next Steps"
 
     - **Try another FHIR server**: Set up a different [FHIR server](./setup_fhir_sandboxes.md) where you can upload the same test patients to multiple instances for true multi-source aggregation.
     - **Expand resource types**: Change `Condition` to `MedicationStatement`, `Observation`, or `Procedure` to aggregate different data.
     - **Add processing**: Extend the pipeline with terminology mapping, entity extraction, or quality checks.
     - **Build on it**: Use aggregated data in the [Clinical Coding tutorial](./clinical_coding.md) or feed it to your LLM application.
-    - **Go to production**: Scaffold a project with `healthchain new` and run with `healthchain serve` — see [From cookbook to service](./index.md#from-cookbook-to-service).
+    - **Go to production**: Scaffold a project with `healthchain new` and run with `healthchain serve` — see [From cookbook to service](./index.md#from-cookbook-to-service). Moving to `healthchain.yaml` also positions you for config-driven compliance support (audit logging, data provenance, deployment metadata) as those features mature.
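The provenance tagging and deduplication this page describes can be sketched in plain Python. This is conceptual only, not HealthChain's implementation; the tag system URI, source names, and condition codes are illustrative:

```python
def aggregate(sources: dict) -> list:
    """Merge Condition resources from several EHR sources, tagging each
    with its origin and de-duplicating on the condition code. Conceptual
    sketch of provenance tagging + dedup; first source seen wins."""
    seen, merged = set(), []
    for source, conditions in sources.items():
        for cond in conditions:
            code = cond["code"]["coding"][0]["code"]
            if code in seen:
                continue  # same condition already captured from another source
            seen.add(code)
            tagged = dict(cond)
            tagged["meta"] = {"tag": [{
                "system": "urn:healthchain:source",  # system URI illustrative
                "code": source,
            }]}
            merged.append(tagged)
    return merged

sources = {
    "epic":   [{"resourceType": "Condition",
                "code": {"coding": [{"code": "E11.9", "display": "Type 2 diabetes"}]}}],
    "cerner": [{"resourceType": "Condition",
                "code": {"coding": [{"code": "E11.9", "display": "Type 2 diabetes"}]}},
               {"resourceType": "Condition",
                "code": {"coding": [{"code": "I10", "display": "Hypertension"}]}}],
}

for cond in aggregate(sources):
    print(cond["code"]["coding"][0]["code"], "from", cond["meta"]["tag"][0]["code"])
# E11.9 from epic
# I10 from cerner
```

Keeping the source in `meta.tag` is what enables the stratified analysis mentioned above, e.g. asking how model performance varies by originating EHR.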
diff --git a/healthchain/cli.py b/healthchain/cli.py index aad282b9..c65cc244 100644 --- a/healthchain/cli.py +++ b/healthchain/cli.py @@ -165,34 +165,30 @@ def patient_view(request: CDSRequest) -> CDSResponse: """ _APP_PY_FHIR_GATEWAY = """\ -import os from typing import List -from healthchain.fhir.r4b import Bundle -from healthchain.fhir.r4b import Condition - +from healthchain.fhir.r4b import Bundle, Condition from healthchain.gateway import FHIRGateway, HealthChainAPI +from healthchain.config.appconfig import AppConfig from healthchain.fhir import merge_bundles from healthchain.io.containers import Document from healthchain.pipeline import Pipeline -# Add FHIR source credentials to .env (see .env.example) -gateway = FHIRGateway() - -epic_url = os.getenv("EPIC_BASE_URL") -cerner_url = os.getenv("CERNER_BASE_URL") +# Loads .env then healthchain.yaml — sources are declared there, credentials stay in .env +config = AppConfig.load() +gateway = FHIRGateway.from_config(config) -if epic_url: - gateway.add_source("epic", epic_url) -if cerner_url: - gateway.add_source("cerner", cerner_url) +# To enable LLM processing: add an llm: section to healthchain.yaml, then uncomment: +# llm = config.llm.to_langchain() if config.llm else None -# Add your NLP/ML/LLM processing steps here pipeline = Pipeline[Document]() @pipeline.add_node def process(doc: Document) -> Document: + # Add your NLP/ML/LLM processing steps here + # if llm: + # response = llm.invoke(doc.text) return doc @@ -226,6 +222,22 @@ def get_patient_conditions(patient_id: str, sources: List[str]) -> Bundle: def _make_healthchain_yaml(name: str, service_type: str) -> str: + if service_type == "fhir-gateway": + sources_block = """\ +# FHIR data sources — credentials stay in .env, source names declared here +# FHIRGateway.from_config(config) in app.py wires these up automatically +sources: + epic: + env_prefix: EPIC # reads EPIC_CLIENT_ID, EPIC_BASE_URL, EPIC_TOKEN_URL from .env + cerner: + env_prefix: CERNER # reads 
CERNER_BASE_URL from .env (no auth for open sandbox)""" + else: + sources_block = """\ +# FHIR data sources — declare sources here, credentials stay in .env +# sources: +# epic: +# env_prefix: EPIC # reads EPIC_CLIENT_ID, EPIC_BASE_URL, EPIC_TOKEN_URL from .env""" + return f"""\ # HealthChain application configuration # https://dotimplement.github.io/HealthChain/reference/config @@ -273,14 +285,9 @@ def _make_healthchain_yaml(name: str, service_type: str) -> str: name: "" environment: development # development | staging | production -# FHIR data sources — declare sources here, credentials stay in .env -# sources: -# medplum: -# env_prefix: MEDPLUM # reads MEDPLUM_CLIENT_ID, MEDPLUM_BASE_URL etc. -# epic: -# env_prefix: EPIC # reads EPIC_CLIENT_ID, EPIC_BASE_URL etc. +{sources_block} -# LLM provider (used by app.py or cookbooks via config.llm.to_langchain()) +# LLM provider — uncomment and set llm = config.llm.to_langchain() in app.py to enable # llm: # provider: anthropic # anthropic | openai | google | huggingface # model: claude-opus-4-6 diff --git a/healthchain/config/appconfig.py b/healthchain/config/appconfig.py index d3da5eb7..c05f040b 100644 --- a/healthchain/config/appconfig.py +++ b/healthchain/config/appconfig.py @@ -146,7 +146,18 @@ def from_yaml(cls, path: Path) -> "AppConfig": @classmethod def load(cls) -> Optional["AppConfig"]: - """Load healthchain.yaml from the current working directory if it exists.""" + """Load healthchain.yaml from the current working directory if it exists. + + Also loads .env into the environment before parsing config, so credentials + are available to any component initialised from the returned config object. 
+ """ + try: + from dotenv import load_dotenv + + load_dotenv() + except ImportError: + pass + config_path = Path(_CONFIG_FILENAME) if config_path.exists(): try: diff --git a/healthchain/gateway/api/app.py b/healthchain/gateway/api/app.py index 4d4d437c..e988fc23 100644 --- a/healthchain/gateway/api/app.py +++ b/healthchain/gateway/api/app.py @@ -247,7 +247,6 @@ def __init__( title: str = "HealthChain API", description: str = "Healthcare Integration API", version: str = "1.0.0", - port: Optional[int] = None, service_type: Optional[str] = None, enable_cors: bool = True, enable_events: bool = True, @@ -275,7 +274,7 @@ def __init__( ) # Display metadata for banner (when running outside healthchain serve) - self._port = port + self._port: Optional[int] = None self._service_type = service_type # Gateway and service registries @@ -635,7 +634,7 @@ async def _startup(self) -> None: except Exception as e: logger.warning(f"Failed to initialize {name}: {e}") - def run(self, host: str = "0.0.0.0", **kwargs) -> None: + def run(self, host: str = "0.0.0.0", port: int = 8000, **kwargs) -> None: """Run the application with uvicorn. Convenience wrapper for local development and cookbooks. For production, @@ -643,16 +642,18 @@ def run(self, host: str = "0.0.0.0", **kwargs) -> None: Args: host: Host to bind to (default: 0.0.0.0) + port: Port to bind to (default: 8000) **kwargs: Passed through to uvicorn.run (e.g. 
reload=True, workers=4) Example: - app = HealthChainAPI(title="My App", port=8888) - app.run() - app.run(reload=True) # with hot reload + app = HealthChainAPI(title="My App") + app.run(port=8888) + app.run(port=8888, reload=True) # with hot reload """ import uvicorn - uvicorn.run(self, host=host, port=self._port or 8000, **kwargs) + self._port = port + uvicorn.run(self, host=host, port=port, **kwargs) async def _shutdown(self) -> None: """Handle graceful shutdown.""" diff --git a/mkdocs.yml b/mkdocs.yml index bc87c06e..52bf2d41 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -26,6 +26,7 @@ nav: - Cookbook: - cookbook/index.md - Setup FHIR Sandbox: cookbook/setup_fhir_sandboxes.md + - FHIR-Grounded Patient Q&A: cookbook/fhir_qa.md - Multi-Source Data Integration: cookbook/multi_ehr_aggregation.md - Automated Clinical Coding: cookbook/clinical_coding.md - Discharge Summarizer: cookbook/discharge_summarizer.md diff --git a/pyproject.toml b/pyproject.toml index 7a18dacd..525e2159 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -49,6 +49,7 @@ dependencies = [ "fastapi-events>=0.12.2,<0.13", "jwt>=1.3.1,<2", "pyyaml>=6.0.3,<7", + "python-dotenv>=1.0.0,<2", ] include = [ "healthchain/templates/*", @@ -70,6 +71,7 @@ dev = [ "pre-commit>=3.5.0,<4", "pytest-asyncio>=0.24.0,<0.25", "ipykernel>=6.29.5,<7", + "pytest-cov>=7.1.0", ] docs = [ "mkdocs>=1.6.1,<2", diff --git a/tests/integration_tests/test_cookbook_imports.py b/tests/integration_tests/test_cookbook_imports.py new file mode 100644 index 00000000..353af841 --- /dev/null +++ b/tests/integration_tests/test_cookbook_imports.py @@ -0,0 +1,34 @@ +"""Smoke tests to verify cookbook examples import without errors. + +Run locally after making changes to HealthChainAPI or gateway interfaces. 
+""" + +import importlib.util +import sys +from pathlib import Path + +import pytest + +COOKBOOK_DIR = Path(__file__).parents[2] / "cookbook" + +COOKBOOKS = [ + "cds_discharge_summarizer_hf_chat", + "cds_discharge_summarizer_hf_trf", + "fhir_context_llm_qa", + "multi_ehr_data_aggregation", + "notereader_clinical_coding_fhir", + "sepsis_cds_hooks", + "sepsis_fhir_batch", +] + + +@pytest.mark.skip(reason="local only") +@pytest.mark.parametrize("name", COOKBOOKS) +def test_cookbook_imports(name): + path = COOKBOOK_DIR / f"{name}.py" + assert path.exists(), f"Cookbook file not found: {path}" + + spec = importlib.util.spec_from_file_location(name, path) + module = importlib.util.module_from_spec(spec) + sys.modules[name] = module + spec.loader.exec_module(module) diff --git a/uv.lock b/uv.lock index 71bfce78..29d781fd 100644 --- a/uv.lock +++ b/uv.lock @@ -333,6 +333,94 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/0c/00/3106b1854b45bd0474ced037dfe6b73b90fe68a68968cef47c23de3d43d2/confection-0.1.5-py3-none-any.whl", hash = "sha256:e29d3c3f8eac06b3f77eb9dfb4bf2fc6bcc9622a98ca00a698e3d019c6430b14", size = 35451, upload-time = "2024-05-31T16:16:59.075Z" }, ] +[[package]] +name = "coverage" +version = "7.13.5" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/9d/e0/70553e3000e345daff267cec284ce4cbf3fc141b6da229ac52775b5428f1/coverage-7.13.5.tar.gz", hash = "sha256:c81f6515c4c40141f83f502b07bbfa5c240ba25bbe73da7b33f1e5b6120ff179", size = 915967, upload-time = "2026-03-17T10:33:18.341Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/69/33/e8c48488c29a73fd089f9d71f9653c1be7478f2ad6b5bc870db11a55d23d/coverage-7.13.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e0723d2c96324561b9aa76fb982406e11d93cdb388a7a7da2b16e04719cf7ca5", size = 219255, upload-time = "2026-03-17T10:29:51.081Z" }, + { url = 
"https://files.pythonhosted.org/packages/da/bd/b0ebe9f677d7f4b74a3e115eec7ddd4bcf892074963a00d91e8b164a6386/coverage-7.13.5-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:52f444e86475992506b32d4e5ca55c24fc88d73bcbda0e9745095b28ef4dc0cf", size = 219772, upload-time = "2026-03-17T10:29:52.867Z" }, + { url = "https://files.pythonhosted.org/packages/48/cc/5cb9502f4e01972f54eedd48218bb203fe81e294be606a2bc93970208013/coverage-7.13.5-cp310-cp310-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:704de6328e3d612a8f6c07000a878ff38181ec3263d5a11da1db294fa6a9bdf8", size = 246532, upload-time = "2026-03-17T10:29:54.688Z" }, + { url = "https://files.pythonhosted.org/packages/7d/d8/3217636d86c7e7b12e126e4f30ef1581047da73140614523af7495ed5f2d/coverage-7.13.5-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:a1a6d79a14e1ec1832cabc833898636ad5f3754a678ef8bb4908515208bf84f4", size = 248333, upload-time = "2026-03-17T10:29:56.221Z" }, + { url = "https://files.pythonhosted.org/packages/2b/30/2002ac6729ba2d4357438e2ed3c447ad8562866c8c63fc16f6dfc33afe56/coverage-7.13.5-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:79060214983769c7ba3f0cee10b54c97609dca4d478fa1aa32b914480fd5738d", size = 250211, upload-time = "2026-03-17T10:29:57.938Z" }, + { url = "https://files.pythonhosted.org/packages/6c/85/552496626d6b9359eb0e2f86f920037c9cbfba09b24d914c6e1528155f7d/coverage-7.13.5-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:356e76b46783a98c2a2fe81ec79df4883a1e62895ea952968fb253c114e7f930", size = 252125, upload-time = "2026-03-17T10:29:59.388Z" }, + { url = "https://files.pythonhosted.org/packages/44/21/40256eabdcbccdb6acf6b381b3016a154399a75fe39d406f790ae84d1f3c/coverage-7.13.5-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:0cef0cdec915d11254a7f549c1170afecce708d30610c6abdded1f74e581666d", size = 247219, upload-time = "2026-03-17T10:30:01.199Z" }, + { url = "https://files.pythonhosted.org/packages/b1/e8/96e2a6c3f21a0ea77d7830b254a1542d0328acc8d7bdf6a284ba7e529f77/coverage-7.13.5-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:dc022073d063b25a402454e5712ef9e007113e3a676b96c5f29b2bda29352f40", size = 248248, upload-time = "2026-03-17T10:30:03.317Z" }, + { url = "https://files.pythonhosted.org/packages/da/ba/8477f549e554827da390ec659f3c38e4b6d95470f4daafc2d8ff94eaa9c2/coverage-7.13.5-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:9b74db26dfea4f4e50d48a4602207cd1e78be33182bc9cbf22da94f332f99878", size = 246254, upload-time = "2026-03-17T10:30:04.832Z" }, + { url = "https://files.pythonhosted.org/packages/55/59/bc22aef0e6aa179d5b1b001e8b3654785e9adf27ef24c93dc4228ebd5d68/coverage-7.13.5-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:ad146744ca4fd09b50c482650e3c1b1f4dfa1d4792e0a04a369c7f23336f0400", size = 250067, upload-time = "2026-03-17T10:30:06.535Z" }, + { url = "https://files.pythonhosted.org/packages/de/1b/c6a023a160806a5137dca53468fd97530d6acad24a22003b1578a9c2e429/coverage-7.13.5-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:c555b48be1853fe3997c11c4bd521cdd9a9612352de01fa4508f16ec341e6fe0", size = 246521, upload-time = "2026-03-17T10:30:08.486Z" }, + { url = "https://files.pythonhosted.org/packages/2d/3f/3532c85a55aa2f899fa17c186f831cfa1aa434d88ff792a709636f64130e/coverage-7.13.5-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:7034b5c56a58ae5e85f23949d52c14aca2cfc6848a31764995b7de88f13a1ea0", size = 247126, upload-time = "2026-03-17T10:30:09.966Z" }, + { url = "https://files.pythonhosted.org/packages/aa/2e/b9d56af4a24ef45dfbcda88e06870cb7d57b2b0bfa3a888d79b4c8debd76/coverage-7.13.5-cp310-cp310-win32.whl", hash = "sha256:eb7fdf1ef130660e7415e0253a01a7d5a88c9c4d158bcf75cbbd922fd65a5b58", size = 221860, upload-time = 
"2026-03-17T10:30:11.393Z" }, + { url = "https://files.pythonhosted.org/packages/9f/cc/d938417e7a4d7f0433ad4edee8bb2acdc60dc7ac5af19e2a07a048ecbee3/coverage-7.13.5-cp310-cp310-win_amd64.whl", hash = "sha256:3e1bb5f6c78feeb1be3475789b14a0f0a5b47d505bfc7267126ccbd50289999e", size = 222788, upload-time = "2026-03-17T10:30:12.886Z" }, + { url = "https://files.pythonhosted.org/packages/4b/37/d24c8f8220ff07b839b2c043ea4903a33b0f455abe673ae3c03bbdb7f212/coverage-7.13.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:66a80c616f80181f4d643b0f9e709d97bcea413ecd9631e1dedc7401c8e6695d", size = 219381, upload-time = "2026-03-17T10:30:14.68Z" }, + { url = "https://files.pythonhosted.org/packages/35/8b/cd129b0ca4afe886a6ce9d183c44d8301acbd4ef248622e7c49a23145605/coverage-7.13.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:145ede53ccbafb297c1c9287f788d1bc3efd6c900da23bf6931b09eafc931587", size = 219880, upload-time = "2026-03-17T10:30:16.231Z" }, + { url = "https://files.pythonhosted.org/packages/55/2f/e0e5b237bffdb5d6c530ce87cc1d413a5b7d7dfd60fb067ad6d254c35c76/coverage-7.13.5-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:0672854dc733c342fa3e957e0605256d2bf5934feeac328da9e0b5449634a642", size = 250303, upload-time = "2026-03-17T10:30:17.748Z" }, + { url = "https://files.pythonhosted.org/packages/92/be/b1afb692be85b947f3401375851484496134c5554e67e822c35f28bf2fbc/coverage-7.13.5-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:ec10e2a42b41c923c2209b846126c6582db5e43a33157e9870ba9fb70dc7854b", size = 252218, upload-time = "2026-03-17T10:30:19.804Z" }, + { url = "https://files.pythonhosted.org/packages/da/69/2f47bb6fa1b8d1e3e5d0c4be8ccb4313c63d742476a619418f85740d597b/coverage-7.13.5-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:be3d4bbad9d4b037791794ddeedd7d64a56f5933a2c1373e18e9e568b9141686", size = 254326, upload-time = 
"2026-03-17T10:30:21.321Z" }, + { url = "https://files.pythonhosted.org/packages/d5/d0/79db81da58965bd29dabc8f4ad2a2af70611a57cba9d1ec006f072f30a54/coverage-7.13.5-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4d2afbc5cc54d286bfb54541aa50b64cdb07a718227168c87b9e2fb8f25e1743", size = 256267, upload-time = "2026-03-17T10:30:23.094Z" }, + { url = "https://files.pythonhosted.org/packages/e5/32/d0d7cc8168f91ddab44c0ce4806b969df5f5fdfdbb568eaca2dbc2a04936/coverage-7.13.5-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3ad050321264c49c2fa67bb599100456fc51d004b82534f379d16445da40fb75", size = 250430, upload-time = "2026-03-17T10:30:25.311Z" }, + { url = "https://files.pythonhosted.org/packages/4d/06/a055311d891ddbe231cd69fdd20ea4be6e3603ffebddf8704b8ca8e10a3c/coverage-7.13.5-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7300c8a6d13335b29bb76d7651c66af6bd8658517c43499f110ddc6717bfc209", size = 252017, upload-time = "2026-03-17T10:30:27.284Z" }, + { url = "https://files.pythonhosted.org/packages/d6/f6/d0fd2d21e29a657b5f77a2fe7082e1568158340dceb941954f776dce1b7b/coverage-7.13.5-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:eb07647a5738b89baab047f14edd18ded523de60f3b30e75c2acc826f79c839a", size = 250080, upload-time = "2026-03-17T10:30:29.481Z" }, + { url = "https://files.pythonhosted.org/packages/4e/ab/0d7fb2efc2e9a5eb7ddcc6e722f834a69b454b7e6e5888c3a8567ecffb31/coverage-7.13.5-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:9adb6688e3b53adffefd4a52d72cbd8b02602bfb8f74dcd862337182fd4d1a4e", size = 253843, upload-time = "2026-03-17T10:30:31.301Z" }, + { url = "https://files.pythonhosted.org/packages/ba/6f/7467b917bbf5408610178f62a49c0ed4377bb16c1657f689cc61470da8ce/coverage-7.13.5-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:7c8d4bc913dd70b93488d6c496c77f3aff5ea99a07e36a18f865bca55adef8bd", size = 249802, upload-time = "2026-03-17T10:30:33.358Z" }, + { url = 
"https://files.pythonhosted.org/packages/75/2c/1172fb689df92135f5bfbbd69fc83017a76d24ea2e2f3a1154007e2fb9f8/coverage-7.13.5-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0e3c426ffc4cd952f54ee9ffbdd10345709ecc78a3ecfd796a57236bfad0b9b8", size = 250707, upload-time = "2026-03-17T10:30:35.2Z" }, + { url = "https://files.pythonhosted.org/packages/67/21/9ac389377380a07884e3b48ba7a620fcd9dbfaf1d40565facdc6b36ec9ef/coverage-7.13.5-cp311-cp311-win32.whl", hash = "sha256:259b69bb83ad9894c4b25be2528139eecba9a82646ebdda2d9db1ba28424a6bf", size = 221880, upload-time = "2026-03-17T10:30:36.775Z" }, + { url = "https://files.pythonhosted.org/packages/af/7f/4cd8a92531253f9d7c1bbecd9fa1b472907fb54446ca768c59b531248dc5/coverage-7.13.5-cp311-cp311-win_amd64.whl", hash = "sha256:258354455f4e86e3e9d0d17571d522e13b4e1e19bf0f8596bcf9476d61e7d8a9", size = 222816, upload-time = "2026-03-17T10:30:38.891Z" }, + { url = "https://files.pythonhosted.org/packages/12/a6/1d3f6155fb0010ca68eba7fe48ca6c9da7385058b77a95848710ecf189b1/coverage-7.13.5-cp311-cp311-win_arm64.whl", hash = "sha256:bff95879c33ec8da99fc9b6fe345ddb5be6414b41d6d1ad1c8f188d26f36e028", size = 221483, upload-time = "2026-03-17T10:30:40.463Z" }, + { url = "https://files.pythonhosted.org/packages/a0/c3/a396306ba7db865bf96fc1fb3b7fd29bcbf3d829df642e77b13555163cd6/coverage-7.13.5-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:460cf0114c5016fa841214ff5564aa4864f11948da9440bc97e21ad1f4ba1e01", size = 219554, upload-time = "2026-03-17T10:30:42.208Z" }, + { url = "https://files.pythonhosted.org/packages/a6/16/a68a19e5384e93f811dccc51034b1fd0b865841c390e3c931dcc4699e035/coverage-7.13.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0e223ce4b4ed47f065bfb123687686512e37629be25cc63728557ae7db261422", size = 219908, upload-time = "2026-03-17T10:30:43.906Z" }, + { url = 
"https://files.pythonhosted.org/packages/29/72/20b917c6793af3a5ceb7fb9c50033f3ec7865f2911a1416b34a7cfa0813b/coverage-7.13.5-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:6e3370441f4513c6252bf042b9c36d22491142385049243253c7e48398a15a9f", size = 251419, upload-time = "2026-03-17T10:30:45.545Z" }, + { url = "https://files.pythonhosted.org/packages/8c/49/cd14b789536ac6a4778c453c6a2338bc0a2fb60c5a5a41b4008328b9acc1/coverage-7.13.5-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:03ccc709a17a1de074fb1d11f217342fb0d2b1582ed544f554fc9fc3f07e95f5", size = 254159, upload-time = "2026-03-17T10:30:47.204Z" }, + { url = "https://files.pythonhosted.org/packages/9d/00/7b0edcfe64e2ed4c0340dac14a52ad0f4c9bd0b8b5e531af7d55b703db7c/coverage-7.13.5-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3f4818d065964db3c1c66dc0fbdac5ac692ecbc875555e13374fdbe7eedb4376", size = 255270, upload-time = "2026-03-17T10:30:48.812Z" }, + { url = "https://files.pythonhosted.org/packages/93/89/7ffc4ba0f5d0a55c1e84ea7cee39c9fc06af7b170513d83fbf3bbefce280/coverage-7.13.5-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:012d5319e66e9d5a218834642d6c35d265515a62f01157a45bcc036ecf947256", size = 257538, upload-time = "2026-03-17T10:30:50.77Z" }, + { url = "https://files.pythonhosted.org/packages/81/bd/73ddf85f93f7e6fa83e77ccecb6162d9415c79007b4bc124008a4995e4a7/coverage-7.13.5-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:8dd02af98971bdb956363e4827d34425cb3df19ee550ef92855b0acb9c7ce51c", size = 251821, upload-time = "2026-03-17T10:30:52.5Z" }, + { url = "https://files.pythonhosted.org/packages/a0/81/278aff4e8dec4926a0bcb9486320752811f543a3ce5b602cc7a29978d073/coverage-7.13.5-cp312-cp312-musllinux_1_2_aarch64.whl", hash = 
"sha256:f08fd75c50a760c7eb068ae823777268daaf16a80b918fa58eea888f8e3919f5", size = 253191, upload-time = "2026-03-17T10:30:54.543Z" }, + { url = "https://files.pythonhosted.org/packages/70/ee/fe1621488e2e0a58d7e94c4800f0d96f79671553488d401a612bebae324b/coverage-7.13.5-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:843ea8643cf967d1ac7e8ecd4bb00c99135adf4816c0c0593fdcc47b597fcf09", size = 251337, upload-time = "2026-03-17T10:30:56.663Z" }, + { url = "https://files.pythonhosted.org/packages/37/a6/f79fb37aa104b562207cc23cb5711ab6793608e246cae1e93f26b2236ed9/coverage-7.13.5-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:9d44d7aa963820b1b971dbecd90bfe5fe8f81cff79787eb6cca15750bd2f79b9", size = 255404, upload-time = "2026-03-17T10:30:58.427Z" }, + { url = "https://files.pythonhosted.org/packages/75/f0/ed15262a58ec81ce457ceb717b7f78752a1713556b19081b76e90896e8d4/coverage-7.13.5-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:7132bed4bd7b836200c591410ae7d97bf7ae8be6fc87d160b2bd881df929e7bf", size = 250903, upload-time = "2026-03-17T10:31:00.093Z" }, + { url = "https://files.pythonhosted.org/packages/0f/e9/9129958f20e7e9d4d56d51d42ccf708d15cac355ff4ac6e736e97a9393d2/coverage-7.13.5-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a698e363641b98843c517817db75373c83254781426e94ada3197cabbc2c919c", size = 252780, upload-time = "2026-03-17T10:31:01.916Z" }, + { url = "https://files.pythonhosted.org/packages/a4/d7/0ad9b15812d81272db94379fe4c6df8fd17781cc7671fdfa30c76ba5ff7b/coverage-7.13.5-cp312-cp312-win32.whl", hash = "sha256:bdba0a6b8812e8c7df002d908a9a2ea3c36e92611b5708633c50869e6d922fdf", size = 222093, upload-time = "2026-03-17T10:31:03.642Z" }, + { url = "https://files.pythonhosted.org/packages/29/3d/821a9a5799fac2556bcf0bd37a70d1d11fa9e49784b6d22e92e8b2f85f18/coverage-7.13.5-cp312-cp312-win_amd64.whl", hash = "sha256:d2c87e0c473a10bffe991502eac389220533024c8082ec1ce849f4218dded810", size = 222900, upload-time = "2026-03-17T10:31:05.651Z" }, + { url 
= "https://files.pythonhosted.org/packages/d4/fa/2238c2ad08e35cf4f020ea721f717e09ec3152aea75d191a7faf3ef009a8/coverage-7.13.5-cp312-cp312-win_arm64.whl", hash = "sha256:bf69236a9a81bdca3bff53796237aab096cdbf8d78a66ad61e992d9dac7eb2de", size = 221515, upload-time = "2026-03-17T10:31:07.293Z" }, + { url = "https://files.pythonhosted.org/packages/74/8c/74fedc9663dcf168b0a059d4ea756ecae4da77a489048f94b5f512a8d0b3/coverage-7.13.5-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5ec4af212df513e399cf11610cc27063f1586419e814755ab362e50a85ea69c1", size = 219576, upload-time = "2026-03-17T10:31:09.045Z" }, + { url = "https://files.pythonhosted.org/packages/0c/c9/44fb661c55062f0818a6ffd2685c67aa30816200d5f2817543717d4b92eb/coverage-7.13.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:941617e518602e2d64942c88ec8499f7fbd49d3f6c4327d3a71d43a1973032f3", size = 219942, upload-time = "2026-03-17T10:31:10.708Z" }, + { url = "https://files.pythonhosted.org/packages/5f/13/93419671cee82b780bab7ea96b67c8ef448f5f295f36bf5031154ec9a790/coverage-7.13.5-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:da305e9937617ee95c2e39d8ff9f040e0487cbf1ac174f777ed5eddd7a7c1f26", size = 250935, upload-time = "2026-03-17T10:31:12.392Z" }, + { url = "https://files.pythonhosted.org/packages/ac/68/1666e3a4462f8202d836920114fa7a5ee9275d1fa45366d336c551a162dd/coverage-7.13.5-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:78e696e1cc714e57e8b25760b33a8b1026b7048d270140d25dafe1b0a1ee05a3", size = 253541, upload-time = "2026-03-17T10:31:14.247Z" }, + { url = "https://files.pythonhosted.org/packages/4e/5e/3ee3b835647be646dcf3c65a7c6c18f87c27326a858f72ab22c12730773d/coverage-7.13.5-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:02ca0eed225b2ff301c474aeeeae27d26e2537942aa0f87491d3e147e784a82b", size = 254780, upload-time = "2026-03-17T10:31:16.193Z" }, + { url = 
"https://files.pythonhosted.org/packages/44/b3/cb5bd1a04cfcc49ede6cd8409d80bee17661167686741e041abc7ee1b9a9/coverage-7.13.5-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:04690832cbea4e4663d9149e05dba142546ca05cb1848816760e7f58285c970a", size = 256912, upload-time = "2026-03-17T10:31:17.89Z" }, + { url = "https://files.pythonhosted.org/packages/1b/66/c1dceb7b9714473800b075f5c8a84f4588f887a90eb8645282031676e242/coverage-7.13.5-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:0590e44dd2745c696a778f7bab6aa95256de2cbc8b8cff4f7db8ff09813d6969", size = 251165, upload-time = "2026-03-17T10:31:19.605Z" }, + { url = "https://files.pythonhosted.org/packages/b7/62/5502b73b97aa2e53ea22a39cf8649ff44827bef76d90bf638777daa27a9d/coverage-7.13.5-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d7cfad2d6d81dd298ab6b89fe72c3b7b05ec7544bdda3b707ddaecff8d25c161", size = 252908, upload-time = "2026-03-17T10:31:21.312Z" }, + { url = "https://files.pythonhosted.org/packages/7d/37/7792c2d69854397ca77a55c4646e5897c467928b0e27f2d235d83b5d08c6/coverage-7.13.5-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:e092b9499de38ae0fbfbc603a74660eb6ff3e869e507b50d85a13b6db9863e15", size = 250873, upload-time = "2026-03-17T10:31:23.565Z" }, + { url = "https://files.pythonhosted.org/packages/a3/23/bc866fb6163be52a8a9e5d708ba0d3b1283c12158cefca0a8bbb6e247a43/coverage-7.13.5-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:48c39bc4a04d983a54a705a6389512883d4a3b9862991b3617d547940e9f52b1", size = 255030, upload-time = "2026-03-17T10:31:25.58Z" }, + { url = "https://files.pythonhosted.org/packages/7d/8b/ef67e1c222ef49860701d346b8bbb70881bef283bd5f6cbba68a39a086c7/coverage-7.13.5-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:2d3807015f138ffea1ed9afeeb8624fd781703f2858b62a8dd8da5a0994c57b6", size = 250694, upload-time = "2026-03-17T10:31:27.316Z" }, + { url = 
"https://files.pythonhosted.org/packages/46/0d/866d1f74f0acddbb906db212e096dee77a8e2158ca5e6bb44729f9d93298/coverage-7.13.5-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ee2aa19e03161671ec964004fb74b2257805d9710bf14a5c704558b9d8dbaf17", size = 252469, upload-time = "2026-03-17T10:31:29.472Z" }, + { url = "https://files.pythonhosted.org/packages/7a/f5/be742fec31118f02ce42b21c6af187ad6a344fed546b56ca60caacc6a9a0/coverage-7.13.5-cp313-cp313-win32.whl", hash = "sha256:ce1998c0483007608c8382f4ff50164bfc5bd07a2246dd272aa4043b75e61e85", size = 222112, upload-time = "2026-03-17T10:31:31.526Z" }, + { url = "https://files.pythonhosted.org/packages/66/40/7732d648ab9d069a46e686043241f01206348e2bbf128daea85be4d6414b/coverage-7.13.5-cp313-cp313-win_amd64.whl", hash = "sha256:631efb83f01569670a5e866ceb80fe483e7c159fac6f167e6571522636104a0b", size = 222923, upload-time = "2026-03-17T10:31:33.633Z" }, + { url = "https://files.pythonhosted.org/packages/48/af/fea819c12a095781f6ccd504890aaddaf88b8fab263c4940e82c7b770124/coverage-7.13.5-cp313-cp313-win_arm64.whl", hash = "sha256:f4cd16206ad171cbc2470dbea9103cf9a7607d5fe8c242fdf1edf36174020664", size = 221540, upload-time = "2026-03-17T10:31:35.445Z" }, + { url = "https://files.pythonhosted.org/packages/23/d2/17879af479df7fbbd44bd528a31692a48f6b25055d16482fdf5cdb633805/coverage-7.13.5-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0428cbef5783ad91fe240f673cc1f76b25e74bbfe1a13115e4aa30d3f538162d", size = 220262, upload-time = "2026-03-17T10:31:37.184Z" }, + { url = "https://files.pythonhosted.org/packages/5b/4c/d20e554f988c8f91d6a02c5118f9abbbf73a8768a3048cb4962230d5743f/coverage-7.13.5-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e0b216a19534b2427cc201a26c25da4a48633f29a487c61258643e89d28200c0", size = 220617, upload-time = "2026-03-17T10:31:39.245Z" }, + { url = 
"https://files.pythonhosted.org/packages/29/9c/f9f5277b95184f764b24e7231e166dfdb5780a46d408a2ac665969416d61/coverage-7.13.5-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:972a9cd27894afe4bc2b1480107054e062df08e671df7c2f18c205e805ccd806", size = 261912, upload-time = "2026-03-17T10:31:41.324Z" }, + { url = "https://files.pythonhosted.org/packages/d5/f6/7f1ab39393eeb50cfe4747ae8ef0e4fc564b989225aa1152e13a180d74f8/coverage-7.13.5-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4b59148601efcd2bac8c4dbf1f0ad6391693ccf7a74b8205781751637076aee3", size = 263987, upload-time = "2026-03-17T10:31:43.724Z" }, + { url = "https://files.pythonhosted.org/packages/a0/d7/62c084fb489ed9c6fbdf57e006752e7c516ea46fd690e5ed8b8617c7d52e/coverage-7.13.5-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:505d7083c8b0c87a8fa8c07370c285847c1f77739b22e299ad75a6af6c32c5c9", size = 266416, upload-time = "2026-03-17T10:31:45.769Z" }, + { url = "https://files.pythonhosted.org/packages/a9/f6/df63d8660e1a0bff6125947afda112a0502736f470d62ca68b288ea762d8/coverage-7.13.5-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:60365289c3741e4db327e7baff2a4aaacf22f788e80fa4683393891b70a89fbd", size = 267558, upload-time = "2026-03-17T10:31:48.293Z" }, + { url = "https://files.pythonhosted.org/packages/5b/02/353ca81d36779bd108f6d384425f7139ac3c58c750dcfaafe5d0bee6436b/coverage-7.13.5-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:1b88c69c8ef5d4b6fe7dea66d6636056a0f6a7527c440e890cf9259011f5e606", size = 261163, upload-time = "2026-03-17T10:31:50.125Z" }, + { url = "https://files.pythonhosted.org/packages/2c/16/2e79106d5749bcaf3aee6d309123548e3276517cd7851faa8da213bc61bf/coverage-7.13.5-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = 
"sha256:5b13955d31d1633cf9376908089b7cebe7d15ddad7aeaabcbe969a595a97e95e", size = 263981, upload-time = "2026-03-17T10:31:51.961Z" }, + { url = "https://files.pythonhosted.org/packages/29/c7/c29e0c59ffa6942030ae6f50b88ae49988e7e8da06de7ecdbf49c6d4feae/coverage-7.13.5-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:f70c9ab2595c56f81a89620e22899eea8b212a4041bd728ac6f4a28bf5d3ddd0", size = 261604, upload-time = "2026-03-17T10:31:53.872Z" }, + { url = "https://files.pythonhosted.org/packages/40/48/097cdc3db342f34006a308ab41c3a7c11c3f0d84750d340f45d88a782e00/coverage-7.13.5-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:084b84a8c63e8d6fc7e3931b316a9bcafca1458d753c539db82d31ed20091a87", size = 265321, upload-time = "2026-03-17T10:31:55.997Z" }, + { url = "https://files.pythonhosted.org/packages/bb/1f/4994af354689e14fd03a75f8ec85a9a68d94e0188bbdab3fc1516b55e512/coverage-7.13.5-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:ad14385487393e386e2ea988b09d62dd42c397662ac2dabc3832d71253eee479", size = 260502, upload-time = "2026-03-17T10:31:58.308Z" }, + { url = "https://files.pythonhosted.org/packages/22/c6/9bb9ef55903e628033560885f5c31aa227e46878118b63ab15dc7ba87797/coverage-7.13.5-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:7f2c47b36fe7709a6e83bfadf4eefb90bd25fbe4014d715224c4316f808e59a2", size = 262688, upload-time = "2026-03-17T10:32:00.141Z" }, + { url = "https://files.pythonhosted.org/packages/14/4f/f5df9007e50b15e53e01edea486814783a7f019893733d9e4d6caad75557/coverage-7.13.5-cp313-cp313t-win32.whl", hash = "sha256:67e9bc5449801fad0e5dff329499fb090ba4c5800b86805c80617b4e29809b2a", size = 222788, upload-time = "2026-03-17T10:32:02.246Z" }, + { url = "https://files.pythonhosted.org/packages/e1/98/aa7fccaa97d0f3192bec013c4e6fd6d294a6ed44b640e6bb61f479e00ed5/coverage-7.13.5-cp313-cp313t-win_amd64.whl", hash = "sha256:da86cdcf10d2519e10cabb8ac2de03da1bcb6e4853790b7fbd48523332e3a819", size = 223851, upload-time = "2026-03-17T10:32:04.416Z" }, + 
{ url = "https://files.pythonhosted.org/packages/3d/8b/e5c469f7352651e5f013198e9e21f97510b23de957dd06a84071683b4b60/coverage-7.13.5-cp313-cp313t-win_arm64.whl", hash = "sha256:0ecf12ecb326fe2c339d93fc131816f3a7367d223db37817208905c89bded911", size = 222104, upload-time = "2026-03-17T10:32:06.65Z" }, + { url = "https://files.pythonhosted.org/packages/9e/ee/a4cf96b8ce1e566ed238f0659ac2d3f007ed1d14b181bcb684e19561a69a/coverage-7.13.5-py3-none-any.whl", hash = "sha256:34b02417cf070e173989b3db962f7ed56d2f644307b2cf9d5a0f258e13084a61", size = 211346, upload-time = "2026-03-17T10:33:15.691Z" }, +] + +[package.optional-dependencies] +toml = [ + { name = "tomli", marker = "python_full_version <= '3.11'" }, +] + [[package]] name = "cryptography" version = "46.0.5" @@ -620,6 +708,7 @@ dependencies = [ { name = "numpy" }, { name = "pandas" }, { name = "pydantic" }, + { name = "python-dotenv" }, { name = "python-liquid" }, { name = "pyyaml" }, { name = "regex" }, @@ -636,6 +725,7 @@ dev = [ { name = "pre-commit" }, { name = "pytest" }, { name = "pytest-asyncio" }, + { name = "pytest-cov" }, { name = "ruff" }, ] docs = [ @@ -660,6 +750,7 @@ requires-dist = [ { name = "numpy", specifier = ">=1.26.0,<2.4.0" }, { name = "pandas", specifier = ">=1.0.0,<3.0.0" }, { name = "pydantic", specifier = ">=2.0.0,<2.11.0" }, + { name = "python-dotenv", specifier = ">=1.0.0,<2" }, { name = "python-liquid", specifier = ">=1.13.0,<2" }, { name = "pyyaml", specifier = ">=6.0.3,<7" }, { name = "regex", specifier = "!=2019.12.17" }, @@ -676,6 +767,7 @@ dev = [ { name = "pre-commit", specifier = ">=3.5.0,<4" }, { name = "pytest", specifier = ">=8.2.0,<9" }, { name = "pytest-asyncio", specifier = ">=0.24.0,<0.25" }, + { name = "pytest-cov", specifier = ">=7.1.0" }, { name = "ruff", specifier = ">=0.4.2,<0.5" }, ] docs = [ @@ -1698,6 +1790,20 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/96/31/6607dab48616902f76885dfcf62c08d929796fc3b2d2318faf9fd54dbed9/pytest_asyncio-0.24.0-py3-none-any.whl", hash = "sha256:a811296ed596b69bf0b6f3dc40f83bcaf341b155a269052d82efa2b25ac7037b", size = 18024, upload-time = "2024-08-22T08:03:15.536Z" }, ] +[[package]] +name = "pytest-cov" +version = "7.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "coverage", extra = ["toml"] }, + { name = "pluggy" }, + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b1/51/a849f96e117386044471c8ec2bd6cfebacda285da9525c9106aeb28da671/pytest_cov-7.1.0.tar.gz", hash = "sha256:30674f2b5f6351aa09702a9c8c364f6a01c27aae0c1366ae8016160d1efc56b2", size = 55592, upload-time = "2026-03-21T20:11:16.284Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9d/7a/d968e294073affff457b041c2be9868a40c1c71f4a35fcc1e45e5493067b/pytest_cov-7.1.0-py3-none-any.whl", hash = "sha256:a0461110b7865f9a271aa1b51e516c9a95de9d696734a2f71e3e78f46e1d4678", size = 22876, upload-time = "2026-03-21T20:11:14.438Z" }, +] + [[package]] name = "python-dateutil" version = "2.9.0.post0" @@ -1710,6 +1816,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" }, ] +[[package]] +name = "python-dotenv" +version = "1.2.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/82/ed/0301aeeac3e5353ef3d94b6ec08bbcabd04a72018415dcb29e588514bba8/python_dotenv-1.2.2.tar.gz", hash = "sha256:2c371a91fbd7ba082c2c1dc1f8bf89ca22564a087c2c287cd9b662adde799cf3", size = 50135, upload-time = "2026-03-01T16:00:26.196Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/0b/d7/1959b9648791274998a9c3526f6d0ec8fd2233e4d4acce81bbae76b44b2a/python_dotenv-1.2.2-py3-none-any.whl", hash = "sha256:1d8214789a24de455a8b8bd8ae6fe3c6b69a5e3d64aa8a8e5d68e694bbcb285a", size = 22101, upload-time = "2026-03-01T16:00:25.09Z" }, +] + [[package]] name = "python-liquid" version = "1.13.0"