@@ -49,7 +49,7 @@ shows how Docker ties them all together with the following tools:
 - [Docker Offload](/offload/) provides a powerful, GPU-accelerated
   environment to run your AI applications with the same Compose-based
   workflow you use locally.
-- [Docker Compose](../manuals/ai/compose/model-runner.md/) is the tool that ties it all
+- [Docker Compose](/manuals/ai/compose/models-and-compose.md) is the tool that ties it all
   together, letting you define and run multi-container applications with a
   single file.
 
@@ -64,7 +64,7 @@ works together.
 To follow this guide, you need to:
 
 - [Install Docker Desktop 4.43 or later](../get-started/get-docker.md)
-- [Enable Docker Model Runner](/ai/model-runner/#enable-dmr-in-docker-desktop)
+- [Enable Docker Model Runner](/manuals/ai/model-runner.md#enable-dmr-in-docker-desktop)
- [Join Docker Offload Beta](/offload/quickstart/)
 
 ## Step 1: Clone the sample application
@@ -384,7 +384,7 @@ that support local and cloud-based agentic AI development:
   tool integrations that follow the Model Context Protocol (MCP) standard.
 - [Docker MCP Gateway](/ai/mcp-gateway/): Orchestrate and manage
   MCP servers to connect agents to external tools and services.
-- [Docker Compose](../manuals/ai/compose/model-runner.md): Define and run
+- [Docker Compose](/manuals/ai/compose/models-and-compose.md): Define and run
   multi-container agentic AI applications with a single file, using the same
   workflow locally and in the cloud.
 - [Docker Offload](/offload/): Run GPU-intensive AI workloads in a secure, managed
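The `models-and-compose.md` page these hunks now link to covers the Compose `models` top-level element. As a rough sketch of what "define and run multi-container AI applications with a single file" looks like in practice (the service name, model name, and build context here are illustrative assumptions, not taken from the sample app):

```yaml
# compose.yaml — minimal sketch of a service wired to a Model Runner model.
# "app" and "ai/smollm2" are placeholder names for illustration.
services:
  app:
    build: .
    models:
      - llm        # grants this service access to the model below

models:
  llm:
    model: ai/smollm2   # model reference pulled by Docker Model Runner
```

Running `docker compose up` with such a file uses the same workflow locally (via Docker Model Runner) and in the cloud (via Docker Offload), which is the point the updated links are making.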