Weirdness in JS
While writing an SQL parser, I noticed some weirdness when setting fields in an object with boolean/null keys. The values can be accessed using `a[true]`, `a.true`, or `a['true']`. This is because JavaScript converts every non-symbol property key to a string, both when the property is set and when it is accessed.
```javascript
const a = {
  true: '1',
  false: '0',
  null: 'null'
}

console.log(a[true], a.true, a['true'])    // 1 1 1
console.log(a[false], a.false, a['false']) // 0 0 0
console.log(a[null], a.null, a['null'])    // null null null
```
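Inspecting the object confirms that the keys are stored as strings. A quick check, reusing the same `a` from above:

```javascript
// Every key ends up as a plain string, however it was written
console.log(Object.keys(a)) // ['true', 'false', 'null']
console.log(Object.keys(a).every(k => typeof k === 'string')) // true
```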
The behavior is the same when the keys are computed:
```javascript
const a = {
  [true]: '1',
  [false]: '0',
  [null]: 'null'
}

console.log(a[true], a.true, a['true'])    // 1 1 1
console.log(a[false], a.false, a['false']) // 0 0 0
console.log(a[null], a.null, a['null'])    // null null null
```
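The conversion is not limited to booleans and `null`; any non-symbol computed key goes through the same string coercion. A small illustration (the object `b` here is just for demonstration):

```javascript
const b = {
  [1]: 'number key',
  [{}]: 'object key' // a plain object stringifies to '[object Object]'
}

console.log(b['1'])               // number key
console.log(b['[object Object]']) // object key
console.log(Object.keys(b))       // ['1', '[object Object]']
```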
And this is the same as the first example; the quoted keys are exactly the strings the other two versions coerce to:
```javascript
const a = {
  ['true']: '1',
  ['false']: '0',
  ['null']: 'null'
}

console.log(a[true], a.true, a['true'])    // 1 1 1
console.log(a[false], a.false, a['false']) // 0 0 0
console.log(a[null], a.null, a['null'])    // null null null
```
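Symbols are the one kind of key that is not converted to a string; a symbol key is stored as-is. A quick sketch for contrast:

```javascript
const sym = Symbol('key')
const c = { [sym]: 'symbol value' }

console.log(c[sym])                          // symbol value
console.log(Object.keys(c))                  // [] (symbol keys are not string keys)
console.log(Object.getOwnPropertySymbols(c)) // [ Symbol(key) ]
```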
So before accessing a field, the key's type should ideally be checked, unless the coercion is intended; otherwise the conversion can at least be made explicit:
```javascript
if (typeof key === 'string') {
  console.log(a[key])
} else {
  // String() also handles null and undefined, where key.toString() would throw
  console.log(a[String(key)])
}
```
This is not really an issue in TypeScript, which raises a compile-time error when the key is not a string, number, or symbol, unless `any` is used.
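For example, here is a minimal sketch, assuming the object is typed as `Record<string, string>` (that typing is an assumption for illustration):

```typescript
const a: Record<string, string> = { true: '1', false: '0', null: 'null' }

// This line would not compile: a boolean cannot be used as an index type
// console.log(a[true])

const key: any = true
console.log(a[key]) // compiles: `any` bypasses the index-type check
```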