I found that this code https://github.com/niieani/gpt-tokenizer/blob/main/src/mapping.ts#L50-L58 removes `o3-mini` from the chat-supported model map, which results in the following error when calling `encodeChat`:

```
Error: Model 'o3-mini' does not support chat.
```

Unfortunately, I don't know which params would be correct for this model, so I'm unable to open a PR, but it seems that patching `chatModelParams` to use the params from `gpt-4` makes it work.