Description
- Version: tested v8.5.0 / master
- Subsystem: core
From the discussion in #16876:
In Node.js:

```js
new (require('util').TextDecoder)('utf-8')
  .decode(Buffer.from([0xF0, 0x80, 0x80])).length === 1 // U+FFFD
```
But in Safari, Firefox, and Chrome:

```js
new TextDecoder('utf-8')
  .decode(new Uint8Array([0xF0, 0x80, 0x80])).length === 3 // U+FFFD U+FFFD U+FFFD
```
I think Node.js is wrong here, per https://encoding.spec.whatwg.org/#utf-8-decoder.
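
For reference, here is a minimal sketch of the spec's UTF-8 decoder algorithm in "replacement" error mode, to show why `0xF0 0x80 0x80` should yield three U+FFFD code points. This is illustrative only (the `decodeUtf8` helper is hypothetical, not the Node.js or browser implementation):

```js
// Minimal sketch of https://encoding.spec.whatwg.org/#utf-8-decoder,
// replacement error mode. Hypothetical helper for illustration.
function decodeUtf8(bytes) {
  let codePoint = 0, bytesSeen = 0, bytesNeeded = 0;
  let lowerBoundary = 0x80, upperBoundary = 0xBF;
  const out = [];
  const error = () => out.push(0xFFFD);

  for (let i = 0; i < bytes.length; i++) {
    const byte = bytes[i];
    if (bytesNeeded === 0) {
      if (byte <= 0x7F) { out.push(byte); }
      else if (byte >= 0xC2 && byte <= 0xDF) { bytesNeeded = 1; codePoint = byte & 0x1F; }
      else if (byte >= 0xE0 && byte <= 0xEF) {
        if (byte === 0xE0) lowerBoundary = 0xA0;
        if (byte === 0xED) upperBoundary = 0x9F;
        bytesNeeded = 2; codePoint = byte & 0xF;
      } else if (byte >= 0xF0 && byte <= 0xF4) {
        if (byte === 0xF0) lowerBoundary = 0x90;
        if (byte === 0xF4) upperBoundary = 0x8F;
        bytesNeeded = 3; codePoint = byte & 0x7;
      } else { error(); } // e.g. a lone 0x80 continuation byte
      continue;
    }
    if (byte < lowerBoundary || byte > upperBoundary) {
      // Spec: reset state, *prepend the byte to the stream*, emit one error.
      codePoint = bytesNeeded = bytesSeen = 0;
      lowerBoundary = 0x80; upperBoundary = 0xBF;
      error();
      i--; // reprocess this byte as a potential new lead byte
      continue;
    }
    lowerBoundary = 0x80; upperBoundary = 0xBF;
    codePoint = (codePoint << 6) | (byte & 0x3F);
    if (++bytesSeen === bytesNeeded) {
      out.push(codePoint);
      codePoint = bytesNeeded = bytesSeen = 0;
    }
  }
  if (bytesNeeded !== 0) error(); // stream ended mid-sequence
  return String.fromCodePoint(...out);
}

// 0xF0 requires a second byte in [0x90, 0xBF]; 0x80 is out of range, so the
// decoder errors and pushes the byte back, and each lone 0x80 then errors
// again on its own: three replacement characters total.
decodeUtf8([0xF0, 0x80, 0x80]).length === 3; // '\uFFFD\uFFFD\uFFFD'
```

So the lead byte `0xF0` produces one error for the out-of-range second byte, and each of the two stray `0x80` bytes produces another, matching the browsers' output of three U+FFFDs rather than one.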
//cc @mathiasbynens