jack-plunder-urbit-comparison-gist

source:: his gist comparing urbit and plunder

mentioned:: [[simple-made-easy-rich-hickey]]

If I’m being realistic, standardizing something as complex and opinionated as an OS, not to mention an application model, doesn’t exactly seem achievable at our current stage of civilization. Rather, it is something that has to evolve under competition, and providing a frozen, persistent execution environment with a base level of compatibility seems like the best way to facilitate that evolution.

^d34525

Specifically, you could give them [a central authority, like the UF, or Vaporware] the ability to coordinate ecosystem-wide changes, as in a Linux distro or Apple’s App Store. All of your main applications could come from your distro, and the distro audits changes to applications. Applications could also be typechecked against each other to prevent breakages at the interfaces between them. Imagine knowing that two applications send correctly typed values over the correct wires, in the right order. The possibilities here are vast, and Arvo has barely scratched the surface of holistic design!

^f192a8
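
The interface-typechecking idea is concrete enough to sketch. Below is a toy Haskell illustration (my own, not Arvo, Hoon, or any Plunder API; every name is hypothetical) of “correctly typed values over the correct wires”: each wire is indexed by the message type it carries, so sending the wrong kind of value down a wire fails to compile. Guaranteeing “in the right order” would additionally need something like session types, which this sketch does not attempt.

```haskell
{-# LANGUAGE GADTs #-}
module WireSketch where

-- Hypothetical message types owned by two applications.
data LedgerMsg    = Credit Integer | Debit Integer
newtype NotifyMsg = Notify String

-- A wire is indexed by the type of message it is allowed to carry.
data Wire msg where
  LedgerWire :: Wire LedgerMsg
  NotifyWire :: Wire NotifyMsg

-- Sending only typechecks when the payload matches the wire's index,
-- so a mis-typed message is caught before anything runs.
send :: Wire msg -> msg -> IO ()
send LedgerWire (Credit n) = putStrLn ("ledger: credit " ++ show n)
send LedgerWire (Debit n)  = putStrLn ("ledger: debit "  ++ show n)
send NotifyWire (Notify s) = putStrLn ("notifier: "      ++ s)

main :: IO ()
main = do
  send LedgerWire (Credit 10)        -- accepted
  send NotifyWire (Notify "paid")    -- accepted
  -- send LedgerWire (Notify "oops") -- rejected at compile time
```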

It is true that, in theory, you can build all possible programs using just eval and apply from LISP 1.5. But in practice, the first case of eval would cause significant problems: assoc runs in linear time, meaning that name dereferencing gets slower the more names you have in your environment. Nock’s biggest innovation might be that it reduced this to logarithmic time. By treating names as a UX affordance rather than something that should be semantically important at the bottom layer, it could replace the association list with a tree (“the subject”). All name dereferencing has to be compiled to Nock 0, which runs in logarithmic time. As such, Nock is the first axiomatic computing system that is even remotely practical to use as a foundation for all of computing.

But logarithmic time is still logarithmic time, especially for something as fundamental as name resolution, an operation performed many, many times in any program you run. And this construction still fights the way modern hardware works. PLAN does away with the environment entirely: if you need a piece of data to be accessible to a piece of code, simply inline it or pass it as an argument. Recall that custom functions are defined using laws {n a b}, where a is the number of arguments. The number of arguments is therefore known for every function, and the argument list can be stored in a contiguous memory block during execution, so dereferencing runs in constant time and works with the hardware instead of against it.
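
To make the complexity comparison concrete, here is a minimal Haskell sketch of the three lookup strategies side by side. It is my own illustration rather than code from LISP 1.5, Urbit, or Plunder; the axis arithmetic in `slot` follows the published Nock definition of the `/` operator, but everything else (the names, the use of `Data.Vector` for the argument frame) is an assumption for the sake of the example.

```haskell
module LookupSketch where

import Data.Vector (Vector, (!?))   -- from the `vector` package

-- 1. LISP 1.5: the environment is an association list, so resolving a
--    name scans it front to back -- O(n) in the number of bindings.
type Env a = [(String, a)]

assocLookup :: String -> Env a -> Maybe a
assocLookup = lookup

-- 2. Nock: names are compiled away into axes (addresses into the
--    subject tree); each step discards half the remaining tree -- O(log n).
data Noun = Atom Integer | Cell Noun Noun
  deriving Show

-- Nock 0 / the `/` operator: axis 1 is the whole subject, 2 and 3 are
-- its head and tail, and axes 2k / 2k+1 are the head / tail of axis k.
slot :: Integer -> Noun -> Maybe Noun
slot 1 n          = Just n
slot 2 (Cell l _) = Just l
slot 3 (Cell _ r) = Just r
slot a n
  | a > 3         = slot (a `div` 2) n >>= slot (2 + a `mod` 2)
slot _ _          = Nothing

-- e.g. axis 7 is the tail of the tail of the subject:
-- slot 7 (Cell (Atom 1) (Cell (Atom 2) (Atom 3)))  ==  Just (Atom 3)

-- 3. PLAN (as described above): a law {n a b} declares its arity a, so
--    a call's arguments can live in one contiguous block, and
--    "argument i" is a plain indexed read -- O(1).
type Frame v = Vector v

argLookup :: Int -> Frame v -> Maybe v
argLookup i frame = frame !? i
```

The last case is what “works with the hardware” means in practice: an indexed read into a contiguous block is a single predictable offset, whereas both the association list and the subject tree are chains of pointer dereferences.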