> you write routes and logic in JavaScript, and the CLI compiles everything into a single Rust + Axum binary using the Boa JS engine
It's not clear to me how this would have better performance than plain Node.js, which is a C++ binary using the V8 JS engine.
It looks like you're handling routing in Rust, but this seems unlikely to move the needle measurably. In fact, it could be hurting you - you're basically betting that the Rust program (route request + invoke JS interpreter + marshal data) is faster than the much simpler JavaScript program (route request). That doesn't seem likely.
This basically makes a Rust server to do the routing, then uses the Boa JS engine to evaluate the JS that handles the route - roughly the shape sketched at the end of this comment.
With this approach, you might be able to do some multithreading to improve the throughput.
However, each request is almost guaranteed to be slower, because V8 will be faster than Boa.
You could also achieve this by spinning up multiple NodeJS instances and putting an nginx server in front to do load balancing - which is pretty standard practice
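For the curious, that Rust-routes-then-Boa shape looks roughly like this. A minimal sketch assuming axum 0.7, tokio, and the boa_engine crate - not Titan's actual code; the route and script are invented for illustration:

```rust
use axum::{routing::get, Router};
use boa_engine::{Context, Source};

// Hypothetical handler: Axum does the routing in Rust, then hands the
// "business logic" to the Boa interpreter and marshals the result back.
async fn hello() -> String {
    // Boa's Context is not Send, so run the evaluation on a blocking
    // thread; each request pays for a fresh interpreter context here.
    tokio::task::spawn_blocking(|| {
        let mut ctx = Context::default();
        let value = ctx
            .eval(Source::from_bytes("`hello from JS, ${20 + 22}`"))
            .expect("script failed");
        // Marshal the JS value back into a Rust String for the response.
        value
            .to_string(&mut ctx)
            .expect("not a string")
            .to_std_string_escaped()
    })
    .await
    .expect("eval task panicked")
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/hello", get(hello));
    let listener = tokio::net::TcpListener::bind("127.0.0.1:3000").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
```

Every request pays for interpreter setup plus marshaling across the Rust/JS boundary - exactly the overhead being bet against here. Caching a context per worker thread would help, but the interpreter cost itself remains.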
> You could also achieve this by spinning up multiple NodeJS instances and putting an nginx server in front to do load balancing - which is pretty standard practice
I've done this in production plenty of times. Under load, nginx is insanely efficient. Practically all the CPU time ends up spent in your Node.js application server.
The worst part of a setup like this is deployment. There are just a lot of little moving pieces - like nginx needs to keep track of which frontend servers are up and which are down. How are you doing load balancing? You want to have websocket connections? That makes it more complex. How do you deploy code? Etc. It's great, but it's not at all simple. Configuring nginx feels like it's a little puzzle all of its own.
TBH, the idea seems way outdated for the current state of software engineering. The Rust compiler provides a massive benefit for AI coding because it catches whole classes of failures at compile time, so all the AI has to do is implement the logic, which is usually a no-brainer for something like Claude Code or Codex.
For example, https://github.com/SaynaAI/sayna has been mostly Claude Code + me reviewing the output + some small manual touches where needed, and for the most part I have found that Claude Code writes far more stable Rust code than JS.
It would be easier and safer to hand the JS code to a translator, have it translated into Rust, and then continue AI dev in Rust than to invest time in an automated JS-to-Rust compiler. IMHO!
I’ve heard it said and I won’t argue your personal experience.
However, I don’t see it that way at all.
I find Claude much more capable of writing large chunks of Python or React/JS frontend code than writing F#, a very statically type-checked language.
It’s fine, but a lot more hand-holding is needed, a lot more tar pits visited.
If anything, it seems to be a popularity contest of which language features most heavily in the training data. If AI assistance is the goal, everyone should write Python and JavaScript.
I’ve worked with relatively large projects in TypeScript, Python, C#, and Swift, and I’ve come to believe the more opinionated the language and framework, the better. C# .NET, despite being a monster, was a breath of fresh air after TS. Each iteration just worked. Each new feature simply gets implemented.
My experience also points to compiled languages that give immediate feedback on build. It’s nearly impossible to stop any AI agent from using 'as any' or 'as unknown as X' casts in TypeScript - LLMs will “fix” problems by sweeping them under the rug. The larger the codebase, the more review and supervision is required. A TS codebase rots much faster than Rust/C#/Swift etc.
You can fix a lot of that with a strict tsconfig, Biome and a handful of claude.md rules, I’ve found. That said, it’s been ages since I wrote a line of C#, but it remains the most productive language I’ve used. My TypeScript productivity has only recently begun to approach it.
As the author of a JS-to-OCaml compiler [1], I must admit that Poe’s Law applies here [2]:
“Without a clear indicator of the author’s intent, any parodic or sarcastic expression of extreme views can be mistaken by some readers for a sincere expression of those views.”

[1] https://dev.to/denyspotapov/porting-is-odd-npm-to-ocaml-usin...

[2] https://en.wikipedia.org/wiki/Poe's_law
Any benchmarking? Because this fundamentally sounds like replacing Node (V8) with another JavaScript engine, which I'm not sure is going to be much of a gain - at which point, why use an entirely different toolchain from the rest of the world?
I guess as long as you have basically no business logic, perhaps it makes sense to orchestrate route handling in Rust?
But Boa is very, very slow compared to JIT-compiled JavaScript. As soon as your business logic starts trying to stand up and walk, I think you’ll start hitting request latency sadness.
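One way to put numbers behind the Boa-vs-JIT point is a crude CPU-bound micro-benchmark. A sketch assuming the boa_engine crate; run the same snippet under `node -e` yourself for the V8 baseline rather than trusting any quoted figure:

```rust
use boa_engine::{Context, Source};
use std::time::Instant;

fn main() {
    // A hot loop is where a JIT (V8) pulls far ahead of a pure
    // interpreter like Boa; compare with: node -e '<same snippet>'
    let script = r#"
        let sum = 0;
        for (let i = 0; i < 1e7; i++) { sum += i; }
        sum;
    "#;

    let mut ctx = Context::default();
    let start = Instant::now();
    let result = ctx.eval(Source::from_bytes(script)).expect("eval failed");
    println!("sum = {}, elapsed = {:?}", result.display(), start.elapsed());
}
```

On snippets like this an interpreter usually sits well behind V8, which is the latency cliff being described once handlers do real work.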
While the idea is somewhat new for today's JS, and I see some benefit in compiling single-purpose servers into tiny binaries, I believe it will take some time to make this popular. You should find the niche where it's needed right now. I would also spend more time on marketing: an explainer, documentation, a landing page. For example, right now the README looks too AI-written.
About the code and DX: it's not good practice to export everything through globals - that's something the JS world moved away from long ago. It quickly turns your code into a barely debuggable mess.
One of your merits listed is "Pure JavaScript developer experience". I don't think most devs, even JS devs, would consider this a merit lol. Cool project either way.
How does it compare in terms of HW resources?
https://bun.com/docs/bundler/executables
Also stating that Bun has no hot reload is just wrong:
https://bun.com/docs/guides/http/hot
A lot of the claims in the comparison table are highly debatable, to put it gently.
"Pure JavaScript developer experience" with yes for Titan and node but no for Bun? If anything Titan's JS support is lagging behind for using Boa.
"Rust-level performance" with yes for Titan and no for all others. But it uses a slow JS engine called Boa. Slower than node and much slower than Bun.
Is it that the two extra characters 'an' in 'titan' are so difficult to type for the CLI, or is it just for the giggles when you 'dev'?
There's a JS-to-Rust transpiler? How? If this is true, it's the most impressive part - the web server/framework is almost irrelevant by comparison.
The AI generated documentation is very confusing.
Second, how does concurrency (like promises) translate to Rust?
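For what it's worth, in an embedded interpreter like Boa, promise reactions land on an internal job queue that the host Rust program has to drain explicitly - its stand-in for Node's event loop. A minimal sketch of my own (assuming boa_engine and its run_jobs API, nothing taken from Titan):

```rust
use boa_engine::{Context, Source};

fn main() {
    let mut ctx = Context::default();

    // The .then callback does not run inline; Boa queues it as a job.
    let script = r#"
        var result = "pending";
        Promise.resolve(21).then(n => { result = n * 2; });
        result; // completion value: still "pending" at this point
    "#;
    let before = ctx.eval(Source::from_bytes(script)).expect("eval failed");
    println!("before run_jobs: {}", before.display());

    // Nothing resolves until the embedder pumps the job queue -- the
    // interpreter equivalent of letting the event loop turn.
    ctx.run_jobs();

    let after = ctx.eval(Source::from_bytes("result")).expect("eval failed");
    println!("after run_jobs: {}", after.display()); // 42
}
```

So a JS-to-Rust setup has to decide when to pump that queue relative to the Tokio event loop driving Axum - which is exactly the non-obvious part the question is poking at.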