Ollama hosting package should support multiple models #32

@aaronpowell

Description

Describe the bug

When you add the Ollama hosting package, it only allows you to specify a single model for Ollama to set up. If you want multiple models (say, one for chat and one for embeddings), you have to create two instances of the hosting package.

It would be better for this to work more like the Azure OpenAI integration, where you can specify multiple deployments on a single resource.
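
As a rough illustration of the request, the sketch below contrasts the Azure OpenAI-style shape with a possible multi-model Ollama API. The `AddModel` method, its signature, and the model names are hypothetical; the `AzureOpenAIDeployment` arguments are illustrative and may not match the exact Aspire API:

```csharp
var builder = DistributedApplication.CreateBuilder(args);

// For comparison: the Azure OpenAI hosting integration lets one resource
// carry multiple deployments (argument shapes illustrative).
var openai = builder.AddAzureOpenAI("openai")
    .AddDeployment(new AzureOpenAIDeployment("chat", "gpt-4o", "2024-05-13"))
    .AddDeployment(new AzureOpenAIDeployment("embed", "text-embedding-3-small", "1"));

// A similar shape for Ollama could avoid registering the package twice.
// AddModel is a hypothetical API, not part of the current package.
var ollama = builder.AddOllama("ollama")
    .AddModel("llama3")       // chat model
    .AddModel("all-minilm");  // embeddings model
```

With a shape like this, a single Ollama container resource could pull and serve both models, rather than one resource per model.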

Regression

No response

Steps to reproduce

No response

Expected behavior

No response

Screenshots

No response

IDE and version

No response

IDE version

No response

Nuget packages

  • CommunityToolkit.Aspire.Hosting.Azure.StaticWebApps
  • CommunityToolkit.Aspire.Hosting.Java

Nuget package version(s)

No response

Additional context

No response

Help us help you

None
