v8 Engine - Why is calling native code from JS so expensive?

Published 2020-07-27 05:07

Question:

Based on multiple answers to other questions, calling native C++ from JavaScript is expensive.

I checked myself with the node module "benchmark" and came to the same conclusion.

A simple JS function reaches ~90,000,000 calls/sec when called directly; when calling a C++ function I get a maximum of about 25,000,000 calls/sec. That in itself is not that bad.

But when the function also creates an object, the JS version still manages about 70,000,000 calls/sec, while the native version suffers dramatically and drops to about 2,000,000.
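For reference, here is a minimal self-contained sketch of the kind of measurement described above. The original used the npm "benchmark" module; the names here are illustrative, and a compiled addon export could be passed in place of `jsAdd`:

```javascript
// Minimal timing sketch (the original used the npm "benchmark" module).
// `fn` could be a plain JS function or one exported by a C++ addon.
function jsAdd(a, b) { return a + b; }

function callsPerSecond(fn, durationMs = 100) {
  let calls = 0;
  let sink = 0; // keep results live so the optimizer cannot drop the calls
  const end = Date.now() + durationMs;
  while (Date.now() < end) {
    for (let i = 0; i < 10000; i++) sink += fn(i, i + 1);
    calls += 10000;
  }
  return { callsPerSec: calls / (durationMs / 1000), sink };
}

console.log(`jsAdd: ~${Math.round(callsPerSecond(jsAdd).callsPerSec)} calls/sec`);
```

The `sink` accumulator matters: if the results were discarded, the measured loop might not survive optimization at all.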

I assume this has to do with the dynamic nature of how the V8 engine works, and with the fact that it compiles the JS code to bytecode.

But what keeps them from implementing the same optimizations for the C++ code? (Or at least providing insight into what would help there?)

Answer 1:

(V8 developer here.) Without seeing the code that you ran, it's hard to be entirely sure what effect you were observing, and based on your descriptions I can't reproduce it. Microbenchmarks in particular tend to be tricky, and the relative speedups or slowdowns they appear to be measuring are often misleading, unless you've verified that what happens under the hood is exactly what you expect to be happening. For example, it could be the case that the optimizing compiler was able to eliminate the entire workload because it could statically prove that the result isn't used anywhere. Or it could be the case that no calls were happening at all, because the compiler chose to inline the callee.
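As an illustration of the dead-code-elimination pitfall mentioned above (a sketch of the general principle, not a claim about what any particular V8 version does with this exact code):

```javascript
// If a benchmark loop's result is never observed, an optimizing compiler
// is in principle free to eliminate the work, so the "benchmark" measures
// an empty loop. Keeping the result observable avoids this.
function workload(n) {
  let sum = 0;
  for (let i = 0; i < n; i++) sum += i;
  return sum;
}

// Risky: the result is discarded, so the whole call may be optimized away.
workload(1e6);

// Safer: keep the result alive and observable.
const sink = workload(1e6);
console.log(sink); // 499999500000
```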

Generally speaking, crossing the JS/C++ boundary is what has a certain cost, due to different calling conventions and some other checks and preparations that need to be done, like checking for exceptions that may have been thrown. Both one JavaScript function calling another, and one C++ function calling another, will be faster than JavaScript calling into C++ or the other way round.

This boundary crossing cost is unrelated to the level of compiler optimization on either side. It's also unrelated to byte code. ("Hot", i.e. frequently executed, JavaScript functions are compiled to machine code anyway.)

Lastly, V8 is not a C++ compiler. It's simply not built to do any optimizations for C++ code. And even if it tried to, there's no reason to assume it could do a better job than your existing C++ compiler with -O3. (V8 also doesn't even see the source code of your C++ module, so before you could experiment with recompiling that, you'd have to figure out how to provide that source.)



Answer 2:

Without delving into specific V8 versions and their intrinsic reasons, I can say that the overhead lies not in how the C++ backend works versus the JavaScript, but in the pathway between the languages: the binary interface that implements the invocation of a native method from the JavaScript side, and vice versa.

The operations involved in a cross-invocation, in my understanding, are:

  1. Prepare the arguments.
  2. Save the JS context.
  3. Invoke gate code (which implements the bridge).
  4. The bridge translates the arguments into C++-style parameters.
  5. The bridge also adapts the calling convention to match C++.
  6. It invokes a C++ runtime API wrapper in V8.
  7. This wrapper calls the actual method to perform the operation.
  8. The same steps are performed in reverse when the C++ function returns.
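The steps above can be modelled conceptually in plain JavaScript. This is an illustrative sketch of the bridge's job, not the real V8 internals; every name in it is hypothetical:

```javascript
// Conceptual model of a cross-invocation: save context, translate
// arguments, call the native implementation, handle exceptions, and
// translate the result back. All names are illustrative.
function crossInvoke(jsContext, nativeImpl, jsArgs) {
  const savedContext = { ...jsContext };        // step 2: save the JS context
  const nativeArgs = jsArgs.map(toNativeValue); // step 4: translate arguments
  let result, pendingException;
  try {
    result = nativeImpl(...nativeArgs);         // steps 6-7: call the wrapper/impl
  } catch (e) {
    pendingException = e;                       // record a pending exception
  }
  Object.assign(jsContext, savedContext);       // step 8: restore on the way back
  if (pendingException) throw pendingException; // re-raise in JS land
  return toJsValue(result);                     // step 8: translate the result
}

// Trivial stand-in translations; a real bridge unwraps V8 handles instead.
function toNativeValue(v) { return typeof v === 'number' ? v : Number(v); }
function toJsValue(v) { return v; }

console.log(crossInvoke({}, (a, b) => a + b, [2, 3])); // 5
```

Even in this toy model, every call pays for context bookkeeping, per-argument translation, and an exception check, which is the fixed per-call overhead the list describes.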

There may be additional steps involved here, but I think this alone suffices to explain why the overhead surfaces.

Now, coming to JS optimizations: the JIT compiler that comes with the V8 engine has two parts: the first simply converts the script into machine code, and the second optimizes the code based on collected profile information. This is a purely dynamic process, and a unique opportunity that a C++ compiler, which works in the static compilation space, cannot match. For example, knowing that an object is created and destroyed in a block of JS code without escaping that scope allows the JIT to optimize away the object's heap allocation (escape analysis and scalar replacement), whereas the object will always live on the JS heap when the native version is invoked.
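A small sketch of the escape-analysis point. Whether V8 actually scalar-replaces this exact shape depends on the version; the example only illustrates the concept:

```javascript
// `norm` builds a temporary object that never escapes the function, so an
// optimizing JIT may replace the allocation with plain locals ("scalar
// replacement"). A precompiled C++ addon cannot be re-optimized like this
// at runtime based on how it is actually used.
function makeVec(x, y) { return { x, y }; }

function norm(x, y) {
  const v = makeVec(x, y); // candidate: `v` never leaves this scope
  return Math.sqrt(v.x * v.x + v.y * v.y);
}

console.log(norm(3, 4)); // 5
```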

Thanks for bringing this up, it is an interesting conversation!



Tags: node.js v8