Commit 8483900

Updated queue/concurrency docs for v4 (#2418)

1 parent 4264fc0 commit 8483900

1 file changed: docs/queue-concurrency.mdx (+17 additions, -98 deletions)
@@ -3,25 +3,24 @@ title: "Concurrency & Queues"
 description: "Configure what you want to happen when there is more than one run at a time."
 ---
 
-When you trigger a task, it isn't executed immediately. Instead, the task [run](/runs) is placed into a queue for execution. By default, each task gets its own queue with unbounded concurrency—meaning the task runs as soon as resources are available, subject only to the overall concurrency limits of your environment. If you need more control (for example, to limit concurrency or share limits across multiple tasks), you can define a custom queue as described later in this document.
+When you trigger a task, it isn't executed immediately. Instead, the task [run](/runs) is placed into a queue for execution.
+
+By default, each task gets its own queue and the concurrency is only limited by your environment concurrency limit. If you need more control (for example, to limit concurrency or share limits across multiple tasks), you can define a custom queue as described later.
 
 Controlling concurrency is useful when you have a task that can't be run concurrently, or when you want to limit the number of runs to avoid overloading a resource.
 
 It's important to note that only actively executing runs count towards concurrency limits. Runs that are delayed or waiting in a queue do not consume concurrency slots until they begin execution.
 
 ## Default concurrency
 
-By default, all tasks have an unbounded concurrency limit, limited only by the overall concurrency limits of your environment. This means that each task could possibly "fill up" the entire
-concurrency limit of your environment.
-
-Each individual queue has a maximum concurrency limit equal to your environment's base concurrency limit. If you don't explicitly set a queue's concurrency limit, it will default to your environment's base concurrency limit.
+By default, all tasks have an unbounded concurrency limit, limited only by the overall concurrency limits of your environment.
 
 <Note>
   Your environment has a base concurrency limit and a burstable limit (default burst factor of 2.0x
   the base limit). Individual queues are limited by the base concurrency limit, not the burstable
   limit. For example, if your base limit is 10, your environment can burst up to 20 concurrent runs,
   but any single queue can have at most 10 concurrent runs. If you're a paying customer you can
-  request higher limits by [contacting us](https://www.trigger.dev/contact).
+  request higher burst limits by [contacting us](https://www.trigger.dev/contact).
 </Note>
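The arithmetic in the note above can be sketched in a few lines. This is an illustrative model only, not the real Trigger.dev limiter; the helper names, and the assumption that an explicit queue limit is clamped to the base limit, are made up for the example.

```typescript
// Toy model of the limits described in the note (not the real limiter).
type EnvLimits = { baseLimit: number; burstFactor: number };

// Maximum concurrent runs across the whole environment (the burstable limit).
function environmentCeiling(env: EnvLimits): number {
  return Math.floor(env.baseLimit * env.burstFactor);
}

// A single queue never exceeds the base limit, even while the environment
// is bursting; an explicit queue limit above the base is assumed clamped.
function queueCeiling(env: EnvLimits, queueLimit?: number): number {
  return Math.min(queueLimit ?? env.baseLimit, env.baseLimit);
}

const env: EnvLimits = { baseLimit: 10, burstFactor: 2.0 };
```

With a base limit of 10 and a burst factor of 2.0, `environmentCeiling(env)` is 20 while `queueCeiling(env)` stays at 10, matching the example in the note.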
 
 ## Setting task concurrency
@@ -72,11 +71,11 @@ export const task2 = task({
 
 In this example, `task1` and `task2` share the same queue, so only one of them can run at a time.

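The "one at a time" behaviour of a shared queue with `concurrencyLimit: 1` can be modelled with a toy queue. This is a sketch of the concept, not the Trigger.dev SDK; `ToyQueue` and `runTask` are invented for illustration.

```typescript
// Toy queue with a concurrency limit (not the Trigger.dev SDK).
class ToyQueue {
  private active = 0;
  private waiters: (() => void)[] = [];
  constructor(private limit: number) {}

  async run<T>(fn: () => Promise<T>): Promise<T> {
    if (this.active >= this.limit) {
      // At capacity: wait until a running task releases its slot.
      await new Promise<void>((resolve) => this.waiters.push(resolve));
    }
    this.active++;
    try {
      return await fn();
    } finally {
      this.active--;
      this.waiters.shift()?.(); // wake the next queued task, if any
    }
  }
}

const shared = new ToyQueue(1); // like two tasks sharing one queue, limit 1
const order: string[] = [];

async function runTask(id: string) {
  await shared.run(async () => {
    order.push(`${id}:start`);
    await new Promise((r) => setTimeout(r, 10)); // simulate work
    order.push(`${id}:end`);
  });
}
```

Triggering both tasks at once still produces strictly serialized execution: the second task's `start` only appears after the first task's `end`.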
-## Setting the concurrency when you trigger a run
+## Setting the queue when you trigger a run
 
-When you trigger a task you can override the concurrency limit. This is really useful if you sometimes have high priority runs.
+When you trigger a task you can override the default queue. This is really useful if you sometimes have high priority runs.
 
-The task:
+The task and queue definition:
 
 ```ts /trigger/override-concurrency.ts
 const paidQueue = queue({
@@ -96,7 +95,7 @@ export const generatePullRequest = task({
 });
 ```
 
-Triggering from your backend and overriding the concurrency:
+Triggering from your backend and overriding the queue:
 
 ```ts app/api/push/route.ts
 import { generatePullRequest } from "~/trigger/override-concurrency";
@@ -105,15 +104,15 @@ export async function POST(request: Request) {
   const data = await request.json();
 
   if (data.branch === "main") {
-    //trigger the task, with a different queue
+    //trigger the task, with the paid users queue
     const handle = await generatePullRequest.trigger(data, {
       // Set the paid users queue
       queue: "paid-users",
     });
 
     return Response.json(handle);
   } else {
-    //triggered with the default (concurrency of 1)
+    //triggered with the default queue (concurrency of 1)
     const handle = await generatePullRequest.trigger(data);
     return Response.json(handle);
   }
@@ -124,7 +123,7 @@ export async function POST(request: Request) {
 
 If you're building an application where you want to run tasks for your users, you might want a separate queue for each of your users (or orgs, projects, etc.).
 
-You can do this by using `concurrencyKey`. It creates a separate queue for each value of the key.
+You can do this by using `concurrencyKey`. It creates a copy of the queue for each unique value of the key.

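The per-key copies can be pictured as a map keyed by queue name plus key value. This is a sketch of the idea, not the actual implementation; `resolveQueue` and the `queueName:concurrencyKey` naming scheme are assumptions made for illustration.

```typescript
// Toy model: one queue copy per distinct concurrency key value.
const queues = new Map<string, string[]>();

// Hypothetical helper: resolves which queue copy a run will wait in.
function resolveQueue(queueName: string, concurrencyKey?: string): string {
  return concurrencyKey ? `${queueName}:${concurrencyKey}` : queueName;
}

function enqueue(runId: string, queueName: string, concurrencyKey?: string) {
  const key = resolveQueue(queueName, concurrencyKey);
  const copy = queues.get(key) ?? [];
  copy.push(runId);
  queues.set(key, copy);
}

// Two users triggering runs on the same named queue get separate copies,
// so one user's backlog never blocks another's.
enqueue("run_1", "free-users", "user_a");
enqueue("run_2", "free-users", "user_b");
enqueue("run_3", "free-users", "user_a");
```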
 Your backend code:
 
@@ -135,18 +134,20 @@ export async function POST(request: Request) {
   const data = await request.json();
 
   if (data.isFreeUser) {
-    //free users can only have 1 PR generated at a time
+    //the "free-users" queue has a concurrency limit of 1
     const handle = await generatePullRequest.trigger(data, {
       queue: "free-users",
+      //this creates a free-users queue for each user
       concurrencyKey: data.userId,
     });
 
     //return a success response with the handle
     return Response.json(handle);
   } else {
-    //trigger the task, with a different queue
+    //the "paid-users" queue has a concurrency limit of 10
     const handle = await generatePullRequest.trigger(data, {
       queue: "paid-users",
+      //this creates a paid-users queue for each user
       concurrencyKey: data.userId,
     });
 
@@ -158,7 +159,7 @@ export async function POST(request: Request) {
 
 ## Concurrency and subtasks
 
-When you trigger a task that has subtasks, the subtasks will not inherit the concurrency settings of the parent task. Unless otherwise specified, subtasks will run on their own queue
+When you trigger a task that has subtasks, the subtasks will not inherit the queue from the parent task. Unless otherwise specified, subtasks will run on their own queue.
 
 ```ts /trigger/subtasks.ts
 export const parentTask = task({
@@ -198,11 +199,6 @@ For example, if you have a queue with a `concurrencyLimit` of 1:
 
 - When the executing run reaches a waitpoint and checkpoints, it releases its slot
 - The next queued run can then begin execution
 
-<Note>
-  We sometimes refer to the parent task as the "parent" and the subtask as the "child". Subtask and
-  child task are used interchangeably. We apologize for the confusion.
-</Note>
-
 ### Waiting for a subtask on a different queue
 
 When a parent task triggers and waits for a subtask on a different queue, the parent task will checkpoint and release its concurrency slot once it reaches the wait point. This prevents environment deadlocks where all concurrency slots would be occupied by waiting tasks.
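The deadlock avoidance described here can be sketched with a toy slot counter. This models the behaviour only, not the Trigger.dev runtime; `Slots`, `parentRun`, and `childRun` are invented names, and the environment limit of 1 is chosen to make the deadlock risk obvious.

```typescript
// Toy slot counter standing in for an environment concurrency limit.
class Slots {
  used = 0;
  constructor(public limit: number) {}
  async acquire() {
    // Poll until a slot is free (good enough for a demo).
    while (this.used >= this.limit) await new Promise((r) => setTimeout(r, 1));
    this.used++;
  }
  release() {
    this.used--;
  }
}

const envSlots = new Slots(1); // environment limit of 1: worst case
const events: string[] = [];

async function childRun() {
  await envSlots.acquire();
  events.push("child:run");
  envSlots.release();
}

async function parentRun() {
  await envSlots.acquire();
  events.push("parent:start");
  envSlots.release(); // checkpoint at the wait point: slot goes back
  await childRun(); // the child can now take the only slot
  await envSlots.acquire(); // resume: re-acquire a slot when one is free
  events.push("parent:resume");
  envSlots.release();
}
```

If the parent held its slot across the wait instead of releasing it, the child could never acquire one and the whole environment would stall; releasing at the wait point is what lets the child proceed.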
@@ -230,80 +226,3 @@ export const subtask = task({
 ```
 
 When the parent task reaches the `triggerAndWait` call, it checkpoints and transitions to the `WAITING` state, releasing its concurrency slot back to both its queue and the environment. Once the subtask completes, the parent task will resume and re-acquire a concurrency slot.
-
-### Waiting for a subtask on the same queue
-
-When a parent task and subtask share the same queue, the checkpointing behavior ensures that recursive task execution can proceed without deadlocks, up to the queue's concurrency limit.
-
-```ts /trigger/waiting-same-queue.ts
-export const myQueue = queue({
-  name: "my-queue",
-  concurrencyLimit: 1,
-});
-
-export const parentTask = task({
-  id: "parent-task",
-  queue: myQueue,
-  run: async (payload) => {
-    //trigger a subtask and wait for it to complete
-    await subtask.triggerAndWait(payload);
-  },
-});
-
-export const subtask = task({
-  id: "subtask",
-  queue: myQueue,
-  run: async (payload) => {
-    //...
-  },
-});
-```
-
-When the parent task checkpoints at the `triggerAndWait` call, it releases its concurrency slot back to the queue, allowing the subtask to execute. Once the subtask completes, the parent task will resume.
-
-However, you can only have recursive waits up to your queue's concurrency limit. If you exceed this limit, you will receive a `RECURSIVE_WAIT_DEADLOCK` error:
-
-```ts /trigger/deadlock.ts
-export const myQueue = queue({
-  name: "my-queue",
-  concurrencyLimit: 1,
-});
-
-export const parentTask = task({
-  id: "parent-task",
-  queue: myQueue,
-  run: async (payload) => {
-    await subtask.triggerAndWait(payload);
-  },
-});
-
-export const subtask = task({
-  id: "subtask",
-  queue: myQueue,
-  run: async (payload) => {
-    await subsubtask.triggerAndWait(payload); // This will cause a deadlock
-  },
-});
-
-export const subsubtask = task({
-  id: "subsubtask",
-  queue: myQueue,
-  run: async (payload) => {
-    //...
-  },
-});
-```
-
-This results in a `RECURSIVE_WAIT_DEADLOCK` error because the queue can only support one level of recursive waiting with a concurrency limit of 1:
-
-![Recursive task deadlock](/images/recursive-task-deadlock-min.png)
-
-### Mitigating recursive wait deadlocks
-
-To avoid recursive wait deadlocks when using shared queues:
-
-1. **Increase the queue's concurrency limit** to allow more levels of recursive waiting
-2. **Use different queues** for parent and child tasks to eliminate the possibility of deadlock
-3. **Design task hierarchies** to minimize deep recursive waiting patterns
-
-Remember that the number of recursive waits you can have on a shared queue is limited by that queue's concurrency limit.