Version: 2.6.2
Calling encode on a very large string of text (429,600,824 characters) throws this error:
RangeError: Invalid array length
This is how I'm using it:

```js
import { encode } from "gpt-tokenizer";

console.log(encode("very long string of text").length);
```
Is there a limit to the string length that `encode` can handle?
I'm using this to count the tokens in my string so I can estimate OpenAI embeddings pricing.
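As a possible workaround while this is investigated: since the goal is only a token count (not the token IDs themselves), the text could be encoded in chunks and the lengths summed, which avoids building one enormous token array. A minimal sketch, where `encodeFn` stands in for `gpt-tokenizer`'s `encode` (pass the imported function); note the count is approximate, because a token that would have spanned a chunk boundary may be split differently:

```javascript
// Count tokens chunk by chunk instead of encoding the whole string at once.
// encodeFn: any function mapping a string to an array of tokens
// (e.g. gpt-tokenizer's `encode`). chunkSize is in characters.
function countTokensChunked(text, encodeFn, chunkSize = 1_000_000) {
  let total = 0;
  for (let i = 0; i < text.length; i += chunkSize) {
    // Encode one slice at a time; only the length is kept, so the
    // per-chunk token arrays can be garbage-collected immediately.
    total += encodeFn(text.slice(i, i + chunkSize)).length;
  }
  return total;
}
```

For pricing estimates the boundary error should be negligible relative to a ~430M-character input.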