This will not meaningfully affect short array literals, but it does
give us a bit of extra perf when evaluating huge array expressions,
like the ones in Kraken/imaging-darkroom.js.
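For a sense of the pattern this targets, here's an illustrative sketch (the constants are made up; the real benchmark embeds far larger literals):

```js
// Hypothetical example: a big array literal of numeric constants,
// similar in shape to the lookup tables in imaging-darkroom.js.
const curve = [
    0.000, 0.004, 0.009, 0.013, 0.018, 0.022, 0.027, 0.031,
    0.036, 0.040, 0.045, 0.049, 0.054, 0.058, 0.063, 0.067,
    // ...continues for thousands of entries in the real thing
];
```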
When deciding whether we need to create a full-blown `arguments` object,
we look at various things, starting as early as the parser.
Until now, if the parser saw the identifier `arguments`, we'd decide
that it's enough of a clue that we should create the `arguments` object
since somebody is obviously accessing it.
However, that missed the case where someone is just accessing a property
named `arguments` on some object. In such cases (`o.arguments`), we now
hold off on creating an `arguments` object.
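For example (an illustrative sketch, not taken from the benchmark), only the first function below still needs a real `arguments` object:

```js
// Bare use of the `arguments` identifier: the parser still treats this
// as a clue, so a full arguments object is created for the call.
function sum() {
    let total = 0;
    for (let i = 0; i < arguments.length; ++i)
        total += arguments[i];
    return total;
}

// Only a property named "arguments" on some object is accessed here
// (o.arguments), so we can hold off on creating an arguments object.
function invoke(o) {
    return o.fn.apply(null, o.arguments);
}
```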
~11% speed-up on Octane/typescript.js :^)
Previously, certain crafted input could cause the JS parser to hang, as
it repeatedly tried to parse an EOF token after hitting an "invalid
destructuring assignment target" error. This change ensures that we
stop parsing after hitting this error condition.
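As a hypothetical example (the actual crafted input isn't reproduced here), a program like this hits that error, since a numeric literal is not a valid assignment target:

```js
// Invalid destructuring assignment target: you can't assign to `0`.
[0] = [1];
```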
Array.length is magical (since it has to reflect the number of elements
in the object's property storage).
We now handle it specially in jitted code, giving us a massive speed-up
on Kraken/ai-astar.js (and probably many other things as well) :^)
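The typical beneficiary is any hot loop that re-reads `.length` on every iteration, e.g. (illustrative sketch):

```js
// Each iteration reads a.length, so a fast path for the magical
// Array.length property pays off directly in loops like this one.
function sumAll(a) {
    let sum = 0;
    for (let i = 0; i < a.length; ++i)
        sum += a[i];
    return sum;
}
```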
Until now, the unwind context stack has not been maintained by jitted
code, which meant we were unable to support the `with` statement.
This is a first step towards supporting that by making jitted code
call out to C++ to update the unwind context stack when entering/leaving
unwind contexts.
We also introduce a new "Catch" bytecode instruction that moves the
current exception into the accumulator. It's always emitted at the start
of a "catch" block.
This patch makes it possible for JS::Object::internal_set() to populate
a CacheablePropertyMetadata, and uses this to implement a basic
monomorphic cache for the most common form of property write access.
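The "most common form" here is a plain `o.x = value` write where the objects keep the same shape, for instance (illustrative sketch):

```js
// Every iteration writes the same properties on same-shaped objects,
// so a monomorphic write cache can reuse the metadata from the first
// lookup instead of redoing the full internal_set() work each time.
function scale(points, factor) {
    for (const p of points) {
        p.x *= factor;
        p.y *= factor;
    }
}
```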
If the property for GetByValue in Generator::load_from_reference
is a calculated value, it would be stored in an allocated register
and returned from the function. Not all callers want this
information, however, so we now only give it out when asked for.
This reduces the instruction count of the Kraken/ai-astar.js function
"neighbours" from 214 to 192.
Previously, usage of GetGlobal was prevented whenever eval() was
present in the scope chain.
With this change, GetGlobal is emitted for `g` in the following program:
```js
function screw_everything_up() {
eval("");
}
var g;
g;
```
This makes the Octane/mandreel.js benchmark run 2x faster :)
Normally, we want to avoid accidentally using such atomics, since
they're much slower. In this case, however, we're just implementing
another atomics API; it is then up to the JavaScript code to avoid
using the slow atomics.
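For instance, script can probe lock-freeness up front and stick to the fast operations (a hypothetical usage example; the Atomics calls shown are standard API):

```js
// JavaScript code can check whether atomic operations on a given
// element size are lock-free before relying on them in hot paths.
const sab = new SharedArrayBuffer(8);
const view = new BigInt64Array(sab);
if (Atomics.isLockFree(view.BYTES_PER_ELEMENT))
    Atomics.add(view, 0, 1n);
```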