The main issue I have with Rust is the lack of a Rust ABI for shared libraries, which makes big dependencies shitty to work with. Another is that a lot of the big, nearly ubiquitous libraries don’t have great documentation; what gets put up on crates.io is insufficient to quickly get an understanding of the library. It’d also be nice if the error messages coming out of rust-analyzer were as verbose as what the compiler gives you. Other than that it’s a really interesting language with a lot of great ideas. The iterator paradigm is really convenient, and the way enums work leads to really expressive code.
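For readers who haven’t seen it, a tiny sketch of the kind of code that comment is praising (the enum and all names here are just illustrative, not from any particular library):

```rust
// Each variant carries its own data, and `match` forces you to
// handle every case, which is where the expressiveness comes from.
enum Shape {
    Circle { radius: f64 },
    Rect { width: f64, height: f64 },
}

// An iterator chain: map each shape to its area, then sum them,
// with no explicit loop or accumulator needed.
fn total_area(shapes: &[Shape]) -> f64 {
    shapes
        .iter()
        .map(|s| match s {
            Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
            Shape::Rect { width, height } => width * height,
        })
        .sum()
}
```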
As someone who has worked in software for 30 years, deploying complicated software, shared libraries are a mistake. You think you get the benefit of size and easy security upgrades, but due to deployment hell you end up using Docker, and now your deployment has actually added a whole OS in size and you need to do security upgrades for that OS instead of just your application.
I use Rust for some software now and build it with musl, and I’m struck by how small things get compared to a regular deployment. It feels like magic that I no longer get glibc incompatibility issues.
Maybe tackle that deployment hell instead of band-aiding it with Docker?
He is. By using statically linked binaries.
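For reference, a minimal sketch of that static-musl workflow (assuming a standard Cargo project on x86_64 Linux with no awkward C dependencies):

```sh
# One-time setup: add the musl target
rustup target add x86_64-unknown-linux-musl

# Build a release binary with no runtime dependency on the host's glibc
cargo build --release --target x86_64-unknown-linux-musl
```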
Technically this is conflating two things: bundling dependencies and static/dynamic linking. But since you have to bundle your dependencies to use static linking, and there’s little point in dynamic linking if you bundle your dependencies… most of the time they are synonymous.
Exceptions are things like plugins, but that’s pretty rare.
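For the plugin case, the usual workaround for the missing Rust ABI is to load over the C ABI at runtime. A rough sketch using the libloading crate (the plugin path and symbol name are made up):

```rust
use libloading::{Library, Symbol};

fn call_plugin() -> Result<u32, libloading::Error> {
    // SAFETY: loading and calling foreign code is inherently unsafe;
    // the plugin must really export this symbol with this signature.
    unsafe {
        let lib = Library::new("./plugin.so")?; // hypothetical path
        let entry: Symbol<unsafe extern "C" fn() -> u32> =
            lib.get(b"plugin_entry")?; // hypothetical symbol name
        Ok(entry())
    }
}
```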
Maybe for your use cases that’s OK, but there are many situations where the size savings and ease of upgrading provided by shared libraries are worthwhile. For example, it would suck to need to push a 40+ GB binary to a fleet of systems with a poor or unreliable internet connection. You could try to mitigate this sort of thing by splitting the application up into microservices, but that adds complexity, and isn’t always a viable tradeoff if maximizing compute efficiency is also a concern.
I’m not so sure that dynamic libraries always reduce the size. Especially with libraries that are only linked by a single binary.
With static libraries, you can conditionally compile only the features you’re gonna use. With dynamic libraries, however, the whole library must be compiled and shipped, whether you use all of it or not (see the sketch below).
EDIT: just to clarify, I’m not saying that static libraries always result in a smaller size. I’m saying that it’s not a black-and-white issue.
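A minimal sketch of what that feature-gating looks like, with an illustrative feature name (nothing here is from a specific library):

```rust
// In the library crate: this module only exists in the build when the
// consumer turns the (illustrative) "json" feature on, so a static
// binary that skips it never pays for the code.
#[cfg(feature = "json")]
pub mod json {
    pub fn split_fields(input: &str) -> Vec<String> {
        input.split(',').map(str::to_owned).collect()
    }
}

// The consumer's Cargo.toml would opt in explicitly, e.g.:
// some_lib = { version = "1", default-features = false, features = ["json"] }
```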
My friend from university sends me his Rust code snippets sometimes. Ngl it looks like a pretty cool language.
There was also that tldr reimplementation in Rust that is a gatrillion times faster than the original.
I really want to give it a try but I have executive dysfunction and don’t have any ideas of what I could use it for.