README.md (+18 −18)

@@ -20,7 +20,7 @@ The platform consists of two main components:
 Most users only need to interact with the `nemo-evaluator-launcher` as universal gateway to different benchmarks and harnesses. It is however possible to interact directly with `nemo-evaluator` by following this [guide](./docs/nemo-evaluator/workflows/using-containers.md).
 
-```mermaid
+```{mermaid}
 graph TD
     A[User] --> B{NeMo Evaluator Launcher};
     B -- " " --> C{Local};

@@ -104,23 +104,23 @@ NeMo Evaluator provides pre-built evaluation containers for different evaluation
 | Container | Description | NGC Catalog | Latest Tag | Supported benchmarks |

docs/nemo-evaluator-launcher/configuration/deployment/index.md (+1 −1)

@@ -25,4 +25,4 @@ deployment:
 ## Configuration Files
 
-See all available deployment configurations: [Deployment Configs](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/tree/main/nemo_evaluator_launcher/src/nemo_evaluator_launcher/configs/deployment?ref_type=heads)
+See all available deployment configurations: [Deployment Configs](../../../../packages/nemo-evaluator-launcher/src/nemo_evaluator_launcher/configs/deployment)

docs/nemo-evaluator-launcher/configuration/deployment/nim.md

-See the complete configuration structure in the [NIM Config File](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/src/nemo_evaluator_launcher/configs/deployment/nim.yaml?ref_type=heads).
+See the complete configuration structure in the [NIM Config File](../../../../packages/nemo-evaluator-launcher/src/nemo_evaluator_launcher/configs/deployment/nim.yaml).
 
 ## Key Settings
 
@@ -17,10 +17,10 @@ Tips:
 - You do not need to adjust params like tensor/data parallelism NIM should pick the best set up based on your hardware.
 
 Examples:
--[Lepton NIM Example](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/lepton_nim_llama_3_1_8b_instruct.yaml?ref_type=heads) - NIM deployment on Lepton platform
+-[Lepton NIM Example](../../../../packages/nemo-evaluator-launcher/examples/lepton_nim_llama_3_1_8b_instruct.yaml) - NIM deployment on Lepton platform
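
For orientation, a NIM deployment block in these configs might look roughly like the sketch below. This is only an illustration of the tip above, not the actual schema: the key names (`type`, `image`, `served_model_name`, `port`) and all values are assumptions here, and the linked nim.yaml is the authoritative reference. Note the absence of any parallelism settings, matching the guidance that NIM picks a suitable setup for your hardware.

```yaml
# Hypothetical sketch of a NIM deployment section (key names are assumptions;
# see configs/deployment/nim.yaml for the real structure).
deployment:
  type: nim                                              # assumed selector for the NIM backend
  image: nvcr.io/nim/meta/llama-3.1-8b-instruct:latest   # illustrative NIM container image
  served_model_name: meta/llama-3.1-8b-instruct          # illustrative model name
  port: 8000                                             # illustrative port
  # No tensor/data-parallelism knobs: per the tip above, NIM chooses the best
  # setup for the available hardware on its own.
```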

docs/nemo-evaluator-launcher/configuration/deployment/none.md (+6 −6)

@@ -27,11 +27,11 @@ target:
 - If your model does not require an API key, you can skip the `api_key` field entirely
 
 Examples:
-- [Local None Example](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/local_llama_3_1_8b_instruct.yaml?ref_type=heads) - Local evaluation with existing endpoint
-- [Lepton None Example](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/lepton_none_llama_3_1_8b_instruct.yaml?ref_type=heads) - Lepton evaluation with existing endpoint
-- [Slurm None Example](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/slurm_no_deployment_llama_3_1_8b_instruct.yaml?ref_type=heads) - Slurm evaluation with existing endpoint
-- [Local with Metadata](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/local_with_user_provided_metadata.yaml?ref_type=heads) - Local evaluation with custom metadata
-- [Auto Export Example](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/local_auto_export_llama_3_1_8b_instruct.yaml?ref_type=heads) - Local evaluation with automatic result export
+- [Local None Example](../../../../packages/nemo-evaluator-launcher/examples/local_llama_3_1_8b_instruct.yaml) - Local evaluation with existing endpoint
+- [Lepton None Example](../../../../packages/nemo-evaluator-launcher/examples/lepton_none_llama_3_1_8b_instruct.yaml) - Lepton evaluation with existing endpoint
+- [Slurm None Example](../../../../packages/nemo-evaluator-launcher/examples/slurm_no_deployment_llama_3_1_8b_instruct.yaml) - Slurm evaluation with existing endpoint
+- [Local with Metadata](../../../../packages/nemo-evaluator-launcher/examples/local_with_user_provided_metadata.yaml) - Local evaluation with custom metadata
+- [Auto Export Example](../../../../packages/nemo-evaluator-launcher/examples/local_auto_export_llama_3_1_8b_instruct.yaml) - Local evaluation with automatic result export
 
 ## Use Cases
 
@@ -43,4 +43,4 @@ This deployment option is useful when:
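
As a rough illustration of the `target` endpoint described in this file: the sketch below points the launcher at an existing OpenAI-compatible endpoint with an optional `api_key`. Only `target` and `api_key` appear in the docs above; every other key name, the nesting, and all values are assumptions, and the linked example YAMLs are the authoritative reference.

```yaml
# Hypothetical "no deployment" target: evaluate against an endpoint you already run.
# Key names other than `target` and `api_key` are assumptions.
deployment:
  type: none                                                 # assumed selector for the no-deployment option
target:
  api_endpoint:                                              # assumed grouping; check the example YAMLs
    url: https://my-endpoint.example.com/v1/chat/completions # illustrative endpoint URL
    model_id: my-org/my-model                                # illustrative served model name
    api_key: MY_API_KEY                                      # optional — skip this field entirely if no key is required
```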

docs/nemo-evaluator-launcher/configuration/deployment/sglang.md (+2 −2)

@@ -4,7 +4,7 @@ SGLang is a fast serving framework for large language models and vision language
 ## Configuration
 
-See the complete configuration structure in the [SGLang Config File](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/src/nemo_evaluator_launcher/configs/deployment/sglang.yaml?ref_type=heads).
+See the complete configuration structure in the [SGLang Config File](../../../../packages/nemo-evaluator-launcher/src/nemo_evaluator_launcher/configs/deployment/sglang.yaml).

docs/nemo-evaluator-launcher/configuration/deployment/vllm.md (+6 −6)

@@ -4,7 +4,7 @@ vLLM is a fast and easy-to-use library for LLM inference and serving.
 ## Configuration
 
-See the complete configuration structure in the [vLLM Config File](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/src/nemo_evaluator_launcher/configs/deployment/vllm.yaml?ref_type=heads).
+See the complete configuration structure in the [vLLM Config File](../../../../packages/nemo-evaluator-launcher/src/nemo_evaluator_launcher/configs/deployment/vllm.yaml).
 
 ## Key Settings
 
@@ -27,13 +27,13 @@ Tips:
 - If `checkpoint_path` is provided instead, use that local path
 
 Examples:
--[Lepton vLLM Example](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/lepton_vllm_llama_3_1_8b_instruct.yaml?ref_type=heads) - vLLM deployment on Lepton platform
--[Slurm vLLM Example](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/slurm_llama_3_1_8b_instruct.yaml?ref_type=heads) - vLLM deployment on Slurm cluster
--[Slurm vLLM HF Example](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/slurm_llama_3_1_8b_instruct_hf.yaml?ref_type=heads) - vLLM with Hugging Face model
--[Notebook API Example](https://gitlab-master.nvidia.com/dl/JoC/competitive_evaluation/nv-eval-platform/-/blob/main/nemo_evaluator_launcher/examples/notebooks/nv-eval-api.ipynb?ref_type=heads) - Python API usage with vLLM
+-[Lepton vLLM Example](../../../../packages/nemo-evaluator-launcher/examples/lepton_vllm_llama_3_1_8b_instruct.yaml) - vLLM deployment on Lepton platform
+-[Slurm vLLM Example](../../../../packages/nemo-evaluator-launcher/examples/slurm_llama_3_1_8b_instruct.yaml) - vLLM deployment on Slurm cluster
+-[Slurm vLLM HF Example](../../../../packages/nemo-evaluator-launcher/examples/slurm_llama_3_1_8b_instruct_hf.yaml) - vLLM with Hugging Face model
+-[Notebook API Example](../../../../packages/nemo-evaluator-launcher/examples/notebooks/nv-eval-api.ipynb) - Python API usage with vLLM
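
To make the `checkpoint_path` tip above concrete, here is a rough sketch of a vLLM deployment block. Apart from `checkpoint_path`, every key name and value below is an assumption used for illustration; the linked vllm.yaml and example configs are the authoritative references.

```yaml
# Hypothetical vLLM deployment section (key names other than `checkpoint_path`
# are assumptions; see configs/deployment/vllm.yaml for the real structure).
deployment:
  type: vllm                                           # assumed selector for the vLLM backend
  hf_model_handle: meta-llama/Llama-3.1-8B-Instruct    # illustrative: pull weights from Hugging Face
  # checkpoint_path: /data/checkpoints/llama-3.1-8b    # per the tip above: if provided, this local path is used instead
  served_model_name: llama-3.1-8b-instruct             # illustrative
  port: 8000                                           # illustrative
```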