Docker Model Runner¶
Since v0.37.0
Introduction¶
The Testcontainers module for DockerModelRunner.
Adding this module to your project dependencies¶
Please run the following command to add the DockerModelRunner module to your Go dependencies:
go get github.com/testcontainers/testcontainers-go/modules/dockermodelrunner
Usage example¶
ctx := context.Background()

const (
    modelNamespace = "ai"
    modelName      = "smollm2"
    modelTag       = "360M-Q4_K_M"
    fqModelName    = modelNamespace + "/" + modelName + ":" + modelTag
)

dmrCtr, err := dockermodelrunner.Run(
    ctx,
    dockermodelrunner.WithModel(fqModelName),
)
defer func() {
    if err := testcontainers.TerminateContainer(dmrCtr); err != nil {
        log.Printf("failed to terminate container: %s", err)
    }
}()
if err != nil {
    log.Printf("failed to start container: %s", err)
    return
}
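The snippet above omits the surrounding file boilerplate. A minimal sketch of the package clause and imports it relies on (the package name is an assumption) looks like this:

package dockermodelrunner_test

import (
    "context"
    "log"

    "github.com/testcontainers/testcontainers-go"
    "github.com/testcontainers/testcontainers-go/modules/dockermodelrunner"
)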
Module Reference¶
Run function¶
- Since v0.37.0
The Docker Model Runner module exposes one entrypoint function to create the Docker Model Runner container:
Run¶
This function receives two parameters:
func Run(ctx context.Context, opts ...testcontainers.ContainerCustomizer) (*Container, error)
- context.Context, the Go context.
- testcontainers.ContainerCustomizer, a variadic argument for passing options.
Info
This function will use the default socat image under the hood. Please refer to the socat module for more information.
Image¶
Use the second argument in the Run function to set a valid Docker image.
For example: Run(context.Background(), "alpine/socat:1.8.0.1").
Container Options¶
When starting the Docker Model Runner container, you can pass options in a variadic way to configure it.
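For instance, module-specific options can be combined with options from the core testcontainers package. The following sketch is illustrative only; the environment variable is a made-up example:

ctx := context.Background()

dmrCtr, err := dockermodelrunner.Run(ctx,
    dockermodelrunner.WithModel("ai/smollm2:360M-Q4_K_M"),
    // WithEnv comes from the core testcontainers package; the variable is purely illustrative.
    testcontainers.WithEnv(map[string]string{"EXAMPLE_FLAG": "enabled"}),
)
if err != nil {
    log.Printf("failed to start container: %s", err)
    return
}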
WithModel¶
- Since v0.37.0
Use the WithModel option to set the model to pull when the container is started. Please be aware that only Models as OCI Artifacts are compatible with Docker Model Runner.
dockermodelrunner.WithModel("ai/llama3.2:latest")
Warning
Multiple calls to this function override the previous value; only the last one takes effect.
You can find a curated collection of cutting-edge AI models as OCI Artifacts, from lightweight on-device models to high-performance LLMs on Docker Hub.
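As a small sketch of the warning above, only the last WithModel value is used here:

// Only the last WithModel call takes effect: this container pulls
// ai/smollm2:360M-Q4_K_M, not ai/llama3.2:latest.
dmrCtr, err := dockermodelrunner.Run(ctx,
    dockermodelrunner.WithModel("ai/llama3.2:latest"),
    dockermodelrunner.WithModel("ai/smollm2:360M-Q4_K_M"),
)
if err != nil {
    log.Printf("failed to start container: %s", err)
    return
}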
The following options are exposed by the testcontainers package.
Basic Options¶
- WithExposedPorts - Since v0.37.0
- WithEnv - Since v0.29.0
- WithWaitStrategy - Since v0.20.0
- WithAdditionalWaitStrategy - Since v0.38.0
- WithWaitStrategyAndDeadline - Since v0.20.0
- WithAdditionalWaitStrategyAndDeadline - Since v0.38.0
- WithEntrypoint - Since v0.37.0
- WithEntrypointArgs - Since v0.37.0
- WithCmd - Since v0.37.0
- WithCmdArgs - Since v0.37.0
- WithLabels - Since v0.37.0
Lifecycle Options¶
- WithLifecycleHooks - Since v0.38.0
- WithAdditionalLifecycleHooks - Since v0.38.0
- WithStartupCommand - Since v0.25.0
- WithAfterReadyCommand - Since v0.28.0
Files & Mounts Options¶
- WithFiles - Since v0.37.0
- WithMounts - Since v0.37.0
- WithTmpfs - Since v0.37.0
- WithImageMount - Since v0.37.0
Build Options¶
- WithDockerfile - Since v0.37.0
Logging Options¶
- WithLogConsumers - Since v0.28.0
- WithLogConsumerConfig - Since v0.38.0
- WithLogger - Since v0.29.0
Image Options¶
- WithAlwaysPull - Since v0.38.0
- WithImageSubstitutors - Since v0.26.0
- WithImagePlatform - Since v0.38.0
Networking Options¶
- WithNetwork - Since v0.27.0
- WithNetworkByName - Since v0.38.0
- WithBridgeNetwork - Since v0.38.0
- WithNewNetwork - Since v0.27.0
Advanced Options¶
- WithHostPortAccess - Since v0.31.0
- WithConfigModifier - Since v0.20.0
- WithHostConfigModifier - Since v0.20.0
- WithEndpointSettingsModifier - Since v0.20.0
- CustomizeRequest - Since v0.20.0
- WithName - Since v0.38.0
- WithNoStart - Since v0.38.0
- WithProvider - Not available until the next release (main)
Experimental Options¶
- WithReuseByName - Since v0.37.0
Container Methods¶
The Docker Model Runner container exposes the following methods:
PullModel¶
- Since v0.37.0
Use the PullModel method to pull a model from the Docker Model Runner. Make sure the passed context is not cancelled before the pull operation completes; otherwise the pull operation is cancelled.
const (
    modelNamespace = "ai"
    modelName      = "smollm2"
    modelTag       = "360M-Q4_K_M"
    fqModelName    = modelNamespace + "/" + modelName + ":" + modelTag
)

ctx, cancel := context.WithTimeout(ctx, 60*time.Second)
defer cancel()

err = dmrCtr.PullModel(ctx, fqModelName)
if err != nil {
    log.Printf("failed to pull model: %s", err)
    return
}
Info
You can find a curated collection of cutting-edge AI models as OCI Artifacts, from lightweight on-device models to high-performance LLMs on Docker Hub.
InspectModel¶
- Since v0.37.0
Use the InspectModel method to inspect a model from the Docker Model Runner by providing the model namespace and name.
err = dmrCtr.PullModel(ctx, modelNamespace+"/"+modelName+":"+modelTag)
if err != nil {
    log.Printf("failed to pull model: %s", err)
    return
}

model, err := dmrCtr.InspectModel(ctx, modelNamespace, modelName+":"+modelTag)
if err != nil {
    log.Printf("failed to get model: %s", err)
    return
}
The model reference follows the <namespace>/<name>:<tag> format that defines Models as OCI Artifacts in Docker Hub: the namespace is the organization and the name is the repository, and the name is passed to InspectModel in the <name>:<tag> form.
E.g. ai/smollm2:360M-Q4_K_M. See Models as OCI Artifacts for more information.
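A fully qualified reference can be split into those parts before calling InspectModel. This is only a sketch; it assumes the strings package is imported and logs the result generically, since the fields of the returned model type are not documented here:

// Decompose "<namespace>/<name>:<tag>" into the arguments expected by InspectModel.
ref := "ai/smollm2:360M-Q4_K_M"
namespace, rest, _ := strings.Cut(ref, "/")
name, tag, _ := strings.Cut(rest, ":")

model, err := dmrCtr.InspectModel(ctx, namespace, name+":"+tag)
if err != nil {
    log.Printf("failed to get model: %s", err)
    return
}
log.Printf("inspected model %s/%s:%s: %+v", namespace, name, tag, model)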
ListModels¶
- Since v0.37.0
Use the ListModels method to list all models that are already pulled locally, using the Docker Model Runner format.
err = dmrCtr.PullModel(ctx, modelNamespace+"/"+modelName+":"+modelTag)
if err != nil {
    log.Printf("failed to pull model: %s", err)
    return
}

models, err := dmrCtr.ListModels(ctx)
if err != nil {
    log.Printf("failed to list models: %s", err)
    return
}
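The returned value can then be iterated to see what is available locally. This sketch assumes ListModels returns a slice and prints each entry generically, since the fields of the model type are not documented here:

for _, m := range models {
    log.Printf("locally available model: %+v", m)
}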