Thanks for the report! Smaller local models (8B–14B) usually don't learn a new format from a long prose description. In practice it works much better to show a tiny TOON example and ask the model to copy that pattern, instead of explaining the whole spec. For example, you can start with a prompt like this:

You will receive data in TOON format (YAML-like with explicit array headers).
Example:
```toon
users[3]{id,name,role}:
1,Alice,admin
2,Bob,user
3,Charlie,user
```
Now here is the real data:
```toon
{{YOUR_TOON_DATA_HERE}}
```
Task: {{YOUR_QUESTION_OR_TRANSFORMATION}}.
If you return data, return it only as a ```toon code block using the same structure.

Keeping the example small (2–5 rows) and very concrete helps a lot, especially on 8B/14B models. For more patterns and ready-made prompt templates, see the "Using TOON with LLMs" guide in the docs.
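The "show a tiny example" approach is also easy to automate when your source data lives in code. The sketch below builds a TOON-style tabular block from a list of dicts so it can be dropped into the prompt template above. `to_toon` is a hypothetical helper, not part of any official TOON library, and it assumes every row has the same keys and no values contain commas (the real format has additional rules such as quoting and indentation).

```python
def to_toon(name: str, rows: list[dict]) -> str:
    """Serialize uniform dicts as a TOON tabular array.

    Minimal sketch matching the example in this thread; assumes
    identical keys per row and comma-free values.
    """
    fields = list(rows[0])  # field order taken from the first row
    header = f"{name}[{len(rows)}]{{{','.join(fields)}}}:"
    lines = [",".join(str(row[f]) for f in fields) for row in rows]
    return "\n".join([header, *lines])

users = [
    {"id": 1, "name": "Alice", "role": "admin"},
    {"id": 2, "name": "Bob", "role": "user"},
]
print(to_toon("users", users))
# users[2]{id,name,role}:
# 1,Alice,admin
# 2,Bob,user
```

Generating the example block and the real data block with the same helper guarantees they share a structure, which is exactly what small models key on.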
I tried the description on the official website, but most AI models still understand it poorly; even small local Ollama models (8B, 14B) are almost unusable with it.
How can I write a prompt that gets the AI to understand the TOON format?