Bun: SQLite and Zero-Overhead FFI
It's 2023, and Bun has finally hit 1.0. While everyone is talking about the test runner and the package manager, the real revolution is under the hood: how Bun handles the bridge between JavaScript and native code.
The Problem with Node.js N-API
In Node.js, calling native code through N-API involves significant per-call overhead. Arguments cross the boundary as opaque handles that must be unwrapped one at a time, and strings and buffers often have to be copied or serialized on the way through. This is why native modules in Node often don't feel as fast as they should.
Bun's Secret: Zig and JSC
Bun is written in Zig and embeds JavaScriptCore (JSC) instead of V8. Because Bun controls its engine integration directly, its creator Jarred Sumner could build what the project bills as zero-overhead FFI (Foreign Function Interface).
Native SQLite
One of the best examples of this is bun:sqlite. It's not a wrapper around a WASM build; it's a native binding that, in Bun's own benchmarks, runs read queries several times faster than better-sqlite3.
import { Database } from "bun:sqlite";
const db = new Database(":memory:");
const query = db.query("SELECT 'Hello world' as message;");
console.log(query.get().message); // "Hello world"
// High-performance inserts (the users table must exist first)
db.run("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)");
const insert = db.prepare("INSERT INTO users (name) VALUES ($name)");
const insertMany = db.transaction(users => {
for (const user of users) insert.run(user);
});
insertMany([{ $name: "Alice" }, { $name: "Bob" }]);
Raw FFI
If you have a custom C library, Bun allows you to call it directly with almost no boilerplate.
import { dlopen, FFIType } from "bun:ffi";
const lib = dlopen("libmymath.so", {
add: {
args: [FFIType.i32, FFIType.i32],
returns: FFIType.i32,
},
});
console.log(lib.symbols.add(10, 20)); // 30
Why It Matters
In 2023, we're building more "heavy" logic in JavaScript. Whether it's processing millions of rows in a local database or running ML inference, the bridge between JS and the OS needs to be transparent. Bun treats native-speed interop as a first-class citizen of the runtime, and that may end up being its most important contribution.