estimated audit backlog: 67560 lines
I started learning rust. Worried about trusting all the various code that gets pulled in from the interwebs to compile the first example project in the book (which depends only on “rand” to get random numbers, which requires 8 different libraries), I installed “cargo vet” so that I’d at least know about it if I accidentally added things that haven’t been vetted by anyone at all.
Doing this installed a further 200 crates, with no indication as to whether they have themselves been vetted by anyone or not, and it told me that half the crates I already had just from adding “rand” have not been vetted by anyone.
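For concreteness, the setup being described boils down to roughly this (a sketch only; the project name is an example, and the exact commands and output may differ between cargo and cargo-vet versions):

    cargo new guessing_game && cd guessing_game
    cargo add rand            # pulls in rand plus its transitive dependencies
    cargo install cargo-vet   # the vetting tool is itself built from yet more source crates
    cargo vet init            # sets up supply-chain/ (existing deps start out as exemptions)
    cargo vet                 # reports unvetted dependencies and an estimated audit backlog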
Anyway, I’m learning rust.
They may not have been formally vetted, but they are vetted in the sense that the majority of those 200 crates are widely used in everyone else’s projects.
But yeah, this is definitely a blind spot, not just for rust, but all modern build systems that accept code from various sources. At least cargo vet is a step in the right direction.
fastrand has zero dependencies.
And all external dependencies are “pulled from the interwebs” nowadays (in source and/or binary form), irrespective of language. This includes core, alloc, and std, which are crates that came with your compiler, which you pulled from the interwebs.
I got the compiler and whatever comes with it from the debian package manager, which has existed for much longer than has crates.io and has had fewer malicious packages get into it.
So, we established that “pulled in from the interwebs” is not a valid differentiator.
which has existed for much longer than has crates.io
True and irrelevant/invalid (see below). Among the arguments that could be made for <some_distro> packages vs. crates.io, age is not one of them. And that’s before we get to the validity of such arguments.
In this case, it is also an apples-to-oranges comparison, since Debian is a binary distro, and crates.io is a source package repository. Which one is “better”, if we were to consider this aspect alone, is left for you to ponder.
and has had fewer malicious packages get into it.
The xz backdoor was discovered on a Debian Sid system, my friend. Can you point to such “malicious packages” that actually had valid users/dependants on crates.io?
Trusting an organization because it has a long track record of being trustworthy is “invalid”? You guys are pretty weird.
Debian (and other “community” distros) is distributed collaboration, not an organization in the sense you’re describing. You’re trusting a large, scattered number of individuals (some anonymous), plus infrastructure and processes. The individuals themselves change all the time. The founder of the whole project is not even still with us, for example.
Not only did those processes do nothing to stop shipping the already-mentioned xz backdoor (malicious upstream), but the well-known blasé attitude towards patching upstream code without good reason within some Debian developer circles has directly caused Debian-only security holes in the past (if you’re young, check this XKCD and the explanation below it). And it just happens that the same blasé attitude is what caused the xz backdoor to affect PID 1 (systemd) in the first place, while that particular attack wasn’t effective/applicable in distros that don’t have such an attitude in their “culture” (e.g. Arch).
On the other hand, other Debian developer(s) were the first to put a lot of effort into making reproducible builds a thing. That was an invaluable contribution.
So there is good, and there is very much some bad. But overall, Debian is nothing special in the world of “traditional” binary distros. In any case, it’s the stipulation “trusting an organization because it has a long track record of being trustworthy”, in the context of Debian, that would be weird.
(The “stable distro” model of shipping old patched upstreams itself is problematic, but this comment is too long already.)
crates.io is a 10+-year-old, upstream-submitted repository of language-specific source packages. It’s both not that comparable to a binary distro, and happens to come with no track record of own goals. It can’t have own goals like the “OpenSSL fiasco” in any case, because the source packages ARE the upstreams. It is also not operated by anonymous people, which is the first practical requirement for any logically coherent trust in an individual or a group. Most community distros can’t have this as a hard requirement by their very nature, although top developers and infrastructure people tend to be known. But it takes one (intentionally or accidentally) malicious binary packager…
You don’t seem to have a coherent picture of a threat model, or actual specific facts about Debian, or crates.io, or anything really, in mind. Just regurgitations about “crates.io BAD” that have been fed mostly by non-techies to non-techies.
I think you are completely missing the point. Packages distributed by Debian are less likely to be insecure because Debian policy requires reviewing all source code to make sure it meets interoperability and open-source standards.
Regardless of how frequently this is actually done, the fact that it’s done at all is a point in favor of using the Debian distribution. The fact that Debian has itself introduced errors in a few cases is irrelevant; any developer can do that, and crates.io is full of them, with not even an attempt at additional review.
You need to balance whether or not the distributor is fixing or introducing more bugs, and in the case of Debian it seems to be overwhelmingly the former.
Your argument that crates.io is a known organization therefore we should trust the packages distributed is undermined by your acknowledgement that crates.io does not produce any code. Instead we are relying on the individual crate developers, who can be as anonymous as they want.
less likely to be insecure
Evidenced by?
requires reviewing all source code
This is exactly the la-la-land view of what distributors do that I was dispelling with facts and reality checks. No one is reviewing all source code of anything, except for cases where a distro developer and an upstream member are the same person. And even then, this may not be the case depending on the upstream project, its size, and the distro developer’s role within that project.
to make sure it meets interoperability
Doesn’t mean anything other than “it builds” and “the API is not broken” (e.g. within the same .so version), and “seems to work”. These considerations happen to hardly exist with the good tooling provided by cargo.
and open-source standards.
Doesn’t mean anything outside of licensing (for code and assets), and “seems to work”.
Your argument that crates.io is a known organization therefore we should trust the packages distributed is undermined by your acknowledgement that crates.io does not produce any code. Instead we are relying on the individual crate developers, who can be as anonymous as they want.
Largely correct. But that was me comparing middle-man vs. middle-man. That is, if crates.io operators can be described as middle-men, since their responsibilities (and consequently, attack vectors) are much smaller.
Barring organizational attacks from within, with crates.io you have one presumably competent/knowledgeable, possibly anonymous, source, and operators that don’t do much. With a binary distro, you have that, AND another “middle-man” source, possibly anonymous, with competence and applicable knowledge <= upstream (being charitable), yet put in a position to decide what to do with what upstream provides, or rather, provided… X years ago, if we are talking about the claimed “stable” release channel.
The middle man pulls sources from places like crates.io anyway. So applying trivial “logic”/“maths”, it can’t be “better”, in the context being discussed.
Software doesn’t get depended on out of thin air. You are either first in line, directly depending on a library, and thus you would naturally at least make the minimum effort to make sure it’s minimally “fit for purpose”. Or you are an indirect dependant, looking at your direct dependencies, and maybe “trusting” them via the “trickle down”.
More processes, especially automated ones, are always welcome to help catch “stuff” early. But it is no surprise that the “success stories” concern crates with fat ZERO dependants.
Processes that help dependants share their knowledge about their dependencies (a la cargo vet) are unquestionably good additions. They sure trump the dogmatic blind faith in distros doing something they simply don’t have the knowledge or resources to do, or the slightly less dogmatic faith that some library is “trustable” if packaged by X or XX distros, on the assumption that at least someone knowledgeable/competent must have given it a thorough look (this has a rough equivalent in the number of dependants anyway).
This is all obvious, and doesn’t take much thought from anyone active on the inside (upstreams or distros), as opposed to the surface “knowledge” that leaks, and possibly gets manipulated, en route to the outside.
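The knowledge-sharing angle is concrete in cargo vet, for what it’s worth: audits you record can be published and imported by other projects. A rough sketch of that workflow (the crate name and version are placeholders, and the import mechanism is as I understand the tool, so check its docs for specifics):

    cargo vet certify some-crate 1.2.3   # record your own audit in supply-chain/audits.toml
    cargo vet suggest                    # list the lowest-effort crates still needing review
    # audits.toml files published by other projects can be listed as imports in
    # supply-chain/config.toml, so their recorded audits count towards your tree too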
You’re correct in your assessment of the worst case of distro maintainers; however, many distro developers/maintainers do contribute to the upstream (Debian policy explicitly encourages it; I only speak for Debian because that’s the only project I’ve worked in) and do vet and understand the software.
“It can’t be better”. Except distro maintainers can block a package from being included if they find errors. As noted above, they also often file pull requests against the upstream. This happens a fair amount, actually.


