
TextDecoder('utf-8') doesn’t match spec #16894

Closed
@srl295

Description

  • Version: tested v8.5.0 / master
  • Subsystem: core

From the discussion in #16876.

In Node.js:

new (require('util').TextDecoder)('utf-8')
 .decode(Buffer.from([0xF0, 0x80, 0x80])).length === 1 // U+FFFD

But in Safari/Firefox/Chrome:

new TextDecoder('utf-8')
 .decode(new Uint8Array([0xF0, 0x80, 0x80])).length === 3 // U+FFFD U+FFFD U+FFFD

I think Node.js is wrong here per https://encoding.spec.whatwg.org/#utf-8-decoder: 0xF0 opens a four-byte sequence whose first continuation byte must be in the range 0x90–0xBF, so a spec-compliant decoder emits U+FFFD for the failed 0xF0 prefix and then reprocesses each stray 0x80 as its own error, giving three replacement characters rather than one.
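
For reference, here is a minimal sketch of the spec's UTF-8 decoder in "replacement" error mode, trimmed to the error handling that matters here (whatwgUtf8Decode is just an illustrative name, not an existing API). Running it on the bytes above produces three U+FFFD, matching the browsers:

// Sketch of https://encoding.spec.whatwg.org/#utf-8-decoder with
// "replacement" error handling; each error emits U+FFFD.
function whatwgUtf8Decode(bytes) {
  const out = [];
  let codePoint = 0, bytesNeeded = 0, bytesSeen = 0;
  let lower = 0x80, upper = 0xBF;

  for (let i = 0; i < bytes.length; i++) {
    const byte = bytes[i];
    if (bytesNeeded === 0) {
      if (byte <= 0x7F) {
        out.push(byte);                      // ASCII
      } else if (byte >= 0xC2 && byte <= 0xDF) {
        bytesNeeded = 1; codePoint = byte & 0x1F;
      } else if (byte >= 0xE0 && byte <= 0xEF) {
        if (byte === 0xE0) lower = 0xA0;
        if (byte === 0xED) upper = 0x9F;
        bytesNeeded = 2; codePoint = byte & 0xF;
      } else if (byte >= 0xF0 && byte <= 0xF4) {
        if (byte === 0xF0) lower = 0x90;     // first continuation must be 0x90–0xBF
        if (byte === 0xF4) upper = 0x8F;
        bytesNeeded = 3; codePoint = byte & 0x7;
      } else {
        out.push(0xFFFD);                    // invalid lead byte
      }
    } else if (byte < lower || byte > upper) {
      // Continuation byte out of range: reset, emit U+FFFD, and
      // reprocess this byte as a new lead byte ("prepend to stream").
      codePoint = bytesNeeded = bytesSeen = 0;
      lower = 0x80; upper = 0xBF;
      out.push(0xFFFD);
      i--;
    } else {
      lower = 0x80; upper = 0xBF;
      codePoint = (codePoint << 6) | (byte & 0x3F);
      if (++bytesSeen === bytesNeeded) {
        out.push(codePoint);
        codePoint = bytesNeeded = bytesSeen = 0;
      }
    }
  }
  if (bytesNeeded !== 0) out.push(0xFFFD);   // truncated sequence at end of input
  return String.fromCodePoint(...out);
}

whatwgUtf8Decode([0xF0, 0x80, 0x80]).length === 3 // U+FFFD U+FFFD U+FFFD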

//cc @mathiasbynens


Labels: c++, i18n-api
