
Commit 22b0653 (1 parent: b91452a)

Minor fixes to quick-tour.mdx (#139)

* Update quick-tour.mdx
* Update src/content/index/quick-tour.mdx

Co-authored-by: Jeannie Finks <[email protected]>


src/content/index/quick-tour.mdx (1 file changed: 8 additions, 9 deletions)
````diff
@@ -163,7 +163,7 @@ from deepsparse.pipelines.custom_pipeline import CustomTaskPipeline
 
 def preprocess(inputs):
     pass # define your function
-def postprocess(outputs)
+def postprocess(outputs):
     pass # define your function
 
 custom_pipeline = CustomTaskPipeline(
````
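The hunk above cuts off mid-constructor. For context, here is a minimal sketch of the completed call; the keyword names (`model_path`, `process_inputs_fn`, `process_outputs_fn`) are illustrative assumptions, not a confirmed signature:

```python
from deepsparse.pipelines.custom_pipeline import CustomTaskPipeline

def preprocess(inputs):
    pass  # define your function

def postprocess(outputs):
    pass  # define your function

# Keyword names below are illustrative assumptions; check the DeepSparse
# docs for the exact CustomTaskPipeline signature.
custom_pipeline = CustomTaskPipeline(
    model_path="path/to/onnx/model.onnx",  # local ONNX file or SparseZoo stub
    process_inputs_fn=preprocess,
    process_outputs_fn=postprocess,
)
```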
````diff
@@ -182,7 +182,7 @@ pipeline_outputs = custom_pipeline(pipeline_inputs)
 **Additional Resources**
 
 - Get Started and [Use A Model](/get-started/use-a-model)
-- Get Started and [Use A Model in a Custom Use Case)](/get-started/use-a-model/custom-use-case)
+- Get Started and [Use A Model in a Custom Use Case](/get-started/use-a-model/custom-use-case)
 - Refer to [Use Cases](/use-cases) for details on usage of supported use cases
 - List of Supported Use Cases [Docs Coming Soon]
 
````
````diff
@@ -207,20 +207,19 @@ predictions.
 
 DeepSparse Server is launched from the CLI, with configuration via either command line arguments or a configuration file.
 
-With the command line argument path, users specify a use case via the `task` argument (e.g. `image_classification` or `question_answering`) as
+With the command line argument path, users specify a use case via the `task` argument (e.g., `image_classification` or `question_answering`) as
 well as a model (either a local ONNX file or a SparseZoo stub) via the `model_path` argument:
 ```bash
-deepsparse.server task [use_case_name] --model_path [model_path]
+deepsparse.server --task [use_case_name] --model_path [model_path]
 ```
 
 With the config file path, users create a YAML file that specifies the server configuration. A YAML file looks like the following:
 
 ```yaml
-num_workers: 4 # specify multi-stream (more than one worker)
 endpoints:
-  - task: [task_name] # specifiy use case (e.g. image_classification, question_answering)
+  - task: task_name # specifiy use case (e.g., image_classification, question_answering)
     route: /predict # specify the route of the endpoint
-    model: [model_path] # specify sparsezoo stub or path to local onnx file
+    model: model_path # specify sparsezoo stub or path to local onnx file
     name: any_name_you_want
 
 # - ... add as many endpoints as neeede
````
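To make the corrected CLI form concrete, a hypothetical launch might look like this; the task name is one of the examples mentioned above, and the model path is a placeholder:

```bash
# Hypothetical invocation of the corrected command; ./model.onnx stands in
# for a real local ONNX file or SparseZoo stub.
deepsparse.server --task question_answering --model_path ./model.onnx
```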
````diff
@@ -229,7 +228,7 @@ endpoints:
 The Server is then launched with the following:
 
 ```bash
-deepsparse.server config_file config.yaml
+deepsparse.server --config_file config.yaml
 ```
 
 Clients interact with the Server via HTTP. Because the Server uses Pipelines internally,
````
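Since clients talk to the Server over HTTP, a minimal client sketch might look like the following; the port and payload shape are assumptions that depend on the server's configuration and task:

```python
import requests

# The port (5543) and the payload key ("sequences") are assumptions; match
# them to your server configuration and the task's expected input schema.
response = requests.post(
    "http://localhost:5543/predict",
    json={"sequences": "DeepSparse Server responds to HTTP requests."},
)
print(response.json())
```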
````diff
@@ -284,7 +283,7 @@ onnx_filepath = "path/to/onnx/model.onnx"
 batch_size = 64
 
 # Generate random sample input
-inputs = generate_random_inputs(model=onnx_filepath, batch_size=batch_size)
+inputs = generate_random_inputs(onnx_filepath, batch_size)
 
 # Compile and run
 engine = Engine(onnx_filepath, batch_size)
````
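Assembled into a self-contained form, the corrected Engine snippet might read as below; the import locations mirror the rest of the quick tour, and calling the engine object directly on the inputs is an assumption here:

```python
from deepsparse import Engine
from deepsparse.utils import generate_random_inputs

onnx_filepath = "path/to/onnx/model.onnx"  # placeholder path
batch_size = 64

# Generate random sample input matching the model's expected shapes
inputs = generate_random_inputs(onnx_filepath, batch_size)

# Compile the model for the local machine and run a forward pass
engine = Engine(onnx_filepath, batch_size)
outputs = engine(inputs)  # assumes the compiled engine is callable
print(outputs)
```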
