var romanToInt = function(s) {
    // `symbols` is the Roman-numeral lookup table defined elsewhere in the LeetCode template
    let value = 0;
    for (let i = 0; i < s.length; i += 1) {
        symbols[s[i]] < symbols[s[i + 1]] ? value -= symbols[s[i]] : value += symbols[s[i]];
    }
    return value;
};
This is a LeetCode example. I am confused by the condition symbols[s[i]] < symbols[s[i + 1]]: I don't understand why s[i + 1] doesn't go out of range. Or is it out of range but evaluated as false?
- Welcome to the wonderful world of JavaScript ;-) – trincot, Jun 25, 2020 at 11:16
- "Out of range" in JavaScript means undefined. – Ilijanovic, Jun 25, 2020 at 11:16
- What have you tried to resolve this? Have you checked what s[i + 1] contains? – Nico Haase, Jun 25, 2020 at 11:26
4 Answers
In JavaScript, arrays are objects, so there is no such thing as "out of range" for a JavaScript array. Array indexes are stored as keys. If an index doesn't exist in the array, undefined is returned.
const arr = [1, 2, 3];
console.log(typeof arr); // output: "object"
console.log(arr[4]); // output: undefined
Array on MDN
JavaScript arrays are zero-indexed: the first element of an array is at index 0, and the last element is at the index equal to the value of the array's length property minus 1.
Using an invalid index number returns undefined.
Emphasis added.
In your case, any comparison with < or > between a number and undefined is false.
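A quick sketch of that behavior (relational comparison coerces undefined to NaN, and every comparison involving NaN yields false):

```javascript
// undefined converts to NaN in relational comparisons,
// and any comparison with NaN is false.
console.log(5 < undefined);     // false
console.log(5 > undefined);     // false
console.log(undefined < 5);     // false
console.log(Number(undefined)); // NaN
```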
JavaScript arrays are just objects, and indexes are keys, just like any other property.
You can witness this behavior here:
let a = [0, 1, 2];
for (let i = 0; i < 5; i++) {
  console.log(a[i]); // indexes 3 and 4 print undefined
}
console.log("Same for other fields:");
console.log(a.foo); // undefined
console.log(a.bar); // undefined
a.foo = "hi!";
console.log(a.foo); // "hi!"
In JavaScript, arrays are not declared with a fixed size. Hence s[i + 1] won't be out of range; it will just be undefined, and the comparison symbols[s[i]] < symbols[s[i + 1]] evaluates to false.