Developers: I will never ever do that, no one should ever do that, and you should be ashamed for guiding people to do so. I get that you want to make things easy for end users, but at least exercise some bare minimum common sense.
The worst part is that bun is just a single binary, so the install script is bloody pointless.
Bonus mildly infuriating is the mere existence of the .sh TLD.
Edit b/c I’m not going to answer the same goddamned questions 100 times from people who blindly copy/paste the question from StackOverflow into their code/terminal:
WhY iS ThaT woRSe thAn jUst DoWnlOADing a BinAary???
- Downloading the compiled binary from the release page (if you don’t want to build yourself) has been a way to acquire software since shortly after the dawn of time. You already know what you’re getting yourself into
- There are SHA256 checksums of each binary file available in each release on Github. You can confirm the binary was not tampered with by comparing a locally computed checksum to the value in the release’s checksums file (example below).
- Binaries can also be signed (not that signing keys have never leaked, but it’s still one step in the chain of trust)
- The install script they’re telling you to pipe is not hosted on Github. A misconfigured / compromised server can allow a bad actor to tamper with the install script that gets piped directly into your shell. The domain could also lapse and be re-registered by a bad actor to point to a malicious script. Really, there’s lots of things that can go wrong with that.
The point is that it is bad practice to just pipe a script to be directly executed in your shell. Developers should not normalize that bad practice.
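To make the checksum point concrete, here’s roughly what that verification looks like on the command line. The file names are placeholders for whatever the release actually ships, not bun’s exact asset names:
$ sha256sum bun-linux-x64.zip
$ grep bun-linux-x64.zip SHASUMS256.txt    # compare the two hashes by eye
# or, if the checksums file sits next to the download:
$ sha256sum -c --ignore-missing SHASUMS256.txt
If the hashes don’t match, you don’t run it. That’s the chain-of-trust step the piped script skips entirely.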
It’s bad practice to do it, but it makes it especially easy for end users who already trust both the source and the script.
On the flip side, you can also just download the script from the site without piping it directly to bash if you want to review what it’s going to do before you run it.
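For what it’s worth, that looks something like this; the URL is the one bun advertises, but the same pattern applies to any of these scripts:
$ curl -fsSL https://bun.sh/install -o bun-install.sh
$ less bun-install.sh      # actually read it before running it
$ bash bun-install.sh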
It’s bad practice to do it, but it makes it especially easy for end users who already trust both the source and the script.
You’re not wrong, but this is what led to the xz “hack” not too long ago. When it comes to data, trust is a fickle mistress.
Would have been much better if they just pasted the (probably quite short) script into the readme so that I can just paste it into my terminal. I have no issue running commands I can have a quick look at.
I would never blindly pipe a script to be executed on my machine though. That’s just next level “asking to get pwned”.
These scripts are usually longer than that and do some checking of which distro you are running before doing something distro-specific.
Doing something distro-specific in an install script for a single binary seems a bit overcomplicated to me, and definitely not something I want to blindly pipe into my shell.
The bun install script in this post determines what platform you’re on, defines a bunch of logging convenience functions, downloads the latest bun release zip file from GitHub, extracts and manually places the binary in the right spot, then determines what shell you’re using and installs autocompletion scripts.
Like, c’mon. That’s a shitload of unnecessary stuff to ask the user to blindly pipe into their shell, all of which could be avoided by putting a couple sentences into a readme. Bare minimum, that script should just be checked into their git repo and documented in their Readme/user docs, but they shouldn’t encourage anyone to pipe it into their shell.
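For a sense of scale, the platform-selection part of scripts like this typically boils down to a handful of lines. This is a rough sketch of the pattern, not bun’s actual script, and the target names are illustrative:
case "$(uname -ms)" in
    "Darwin arm64")  target=darwin-aarch64 ;;
    "Darwin x86_64") target=darwin-x64 ;;
    "Linux x86_64")  target=linux-x64 ;;
    *) echo "error: unsupported platform" >&2; exit 1 ;;
esac
# ...then fetch the matching release zip, unpack it, copy one binary
# somewhere on PATH, and append completion setup to your shell config
Everything after that is a download, an unzip, and a copy, which is exactly the kind of thing a couple of readme sentences cover.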
Installing Rust: curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh (source)
Installing Homebrew: /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)" (source)
I understand that you find it infuriating, but it’s not something completely uncommon, even in high-end projects :/
Common or not, it’s still fucking awful and the people who promote this nonsense should be ashamed of themselves.
Don’t forget Pi-hole! It’s been the default install method since basically the beginning.
Thankfully, I’m using the docker version, which everyone should use.
Yah, when I read this, I was like, pretty sure pi-hole started this as a popular option. I dig it though, so I guess OP and I are not on the same page. (I do usually look over the bash scripts before running them piped to bash, though.)
For Rust at least, those are packaged in Debian and other distros. I think rustup is in Debian Trixie too.
--proto '=https' --tlsv1.2
That’s how you know they care: no MITMing that stuff without hijacking the CA, at which point you have a whole other set of problems. And if you trust rustc not to delete your sources when they fail a typecheck, then you can trust their installer. -f is important so curl fails on server errors instead of piping an HTML error page into your shell, -s and -S are verbosity options (silent, but still print errors), and -L follows redirects.
There is even a Windows (PowerShell) example for Winutil:
Stable Branch (Recommended)
irm "https://christitus.com/win" | iex
Better than explaining how to make a .ps1 file trusted for execution (thankfully, one of the few executable file extensions that Windows doesn’t trust by default), but why not just use some basic .exe builder at this point?
Obligatory “they better make it a script that automatically creates a medium for silent Linux Mint installation, modifies the relevant BIOS settings and restarts” to prevent obvious snarky replies
Using a url that’s just some dude’s name makes this so much worse.
He’s reasonably trustworthy. I trust his utility more than Micro$oft but less than Linus Torvalds.
Don’t forget everyone’s favorite massgravel script
I’ve seen a lot of projects doing this lately. Just run this script, I made it so easy!
Please, devs, stop this. There are defined ways to distribute your apps. If it’s local, provide a binary, a flatpak, or an exe. For Docker, provide a Docker image with well-documented environment variables, ports, and volumes. I do not want arbitrary scripts that set all this up for me, I want the defined ways to do this.
Would you prefer
$ curl xyz
$ chmod +x xyz
$ ./xyz
?
You can detect server-side whether curl is piping the script to Bash and running it vs just downloading it, and inject malicious code only in the case where no one is viewing it.
https://github.com/Stijn-K/curlbash_detect
So that would at least be a minor improvement
In most cases the script already installs a pre-compiled binary that can be anything; they wouldn’t need to make the script itself malicious if they were bad actors.
I mean, how about:
- Download the release for your arch from the releases page.
- Extract it to ~/.local/bin
- Run it (rough commands below)
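Concretely, that’s about four commands. The asset name below follows what bun’s GitHub releases look like, but treat it (and the zip’s internal layout) as a placeholder for whichever file matches your arch:
$ mkdir -p ~/.local/bin
$ curl -fsSLO https://github.com/oven-sh/bun/releases/latest/download/bun-linux-x64.zip
$ unzip bun-linux-x64.zip && mv bun-linux-x64/bun ~/.local/bin/
$ bun --version    # assuming ~/.local/bin is already on your PATH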
I think you missed the point.
Why is that safer/better? That binary can do anything a shell script can, and it’s a lot harder to inspect.
- That’s been the way to acquire software since shortly after the dawn of time. You already know what you’re getting yourself into.
- There are SHA256 checksums of each binary file available in each release on Github. You can confirm the binary was not tampered with by comparing a locally computed checksum to the value in the release’s checksums file.
- Binaries can also be signed (not that signing keys have never leaked, but it’s still one step in the chain of trust)
- The install script is not hosted on Github. A misconfigured / compromised server can allow a bad actor to tamper with the install script that gets piped directly into your shell. The domain could also lapse and be re-registered by a bad actor to point to a malicious script. Really, there’s lots of things that can go wrong with that.
The point is that it is bad practice to just pipe a script to be directly executed in your shell. Developers should not normalize that bad practice.
If you trust them enough to use their binary, why don’t you trust them enough to run their install scripts as well?
How do you know the script hasn’t been compromised? Is every user competent enough to evaluate it to ensure it’s safe to run?
Using package managers to handle this provides a couple of things. First, most package managers have built-in mechanisms to ensure the binary is unmodified. Second, they provide a third party validating the packages.
How do you know the script hasn’t been compromised?
You don’t, same as you don’t know if the binary has been compromised, just like when an npm package deleted files for Russian users. I get that running scripts from the internet without looking at them first to understand what they do is not secure, but downloading and running anything from the internet comes with some amount of risk. How do you know that you won’t be mining cryptocurrency in addition to the original purpose of the binary? You don’t, unless you read the source code.
It all comes down to if you trust the provider or not. Personally, if I trust them enough to run binary files on my computer, I trust them enough to use their scripts for installation. I don’t agree that something is more unsafe just because it is a script.
package manager
Not everything is provided with a package manager, and not everything is up to date with the OS-provided package manager. I agree that one should ideally use a package manager with third-party validation if that is an option.
- No one is talking about npm libraries. We’re talking about released packages.
- You absolutely can ensure a binary hasn’t been tampered with. It’s called checksumming.
- You’re confusing MITM attacks with supply chain attacks. MITM attacks are far easier to pull off.
Not everything is provided with a package manager
Yes, that’s precisely the problem we’re pointing out to you. If you’re going to provide software over the internet, provide a proper package with checksum validation. It’s not hard; stop providing bash scripts.
Trust and security aren’t just about protecting from malice, but also mistakes.
For example, AUR packages are basically install scripts, and there have been a few that have done crazy things like delete a user’s /bin — not out of any malice, but rather simple human error.
Binaries are going to be much, much less prone to these mistakes because they are in languages the creators have more experience with, and are comfortable in. Just because I trust someone to write code that runs on my computer, doesn’t mean I trust them to write an install script, especially given how many footguns bash has.
I agree, but hey, at least you can inspect the script before running it, in contrast to every binary installer you’re asked to download.
What’s that? A connection problem? Ah, it’s already running the part that it did get… Oops, right on the boundary of rm -rf /thing/that/got/cut/off. I’m angry now. I expected the script maintainer to keep in mind that their script could be cut off at literally any point… (Now what is that set -e the maintainer keeps yapping about?)
Can you really expect maintainers to keep network errors in mind when writing a Bash script? I’ll just download your script first, like I would your binary. Opening yourself up to more issues like this is just plain dumb.
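For what it’s worth, set -e alone doesn’t save you from a truncated download; the usual defence is the wrap-everything-in-a-function pattern. A generic sketch (not any particular project’s actual script):
#!/usr/bin/env bash
set -euo pipefail

main() {
    # all the real work lives in here
    echo "installing..."
    # rm -rf "$tmp_dir"   # dangerous steps only ever run from inside main
}

# Nothing executes until this last line. If the transfer is cut off anywhere
# above it, main is either incompletely defined (syntax error) or never
# called, so a half-downloaded script does nothing instead of half-running.
main "$@"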
I’ll do it if it’s hosted on GitHub and I can look at the code first, but if it’s proprietary? Heck no.
I’m gonna go out on a limb and say you find this more than mildly infuriating.
I assume your concern is with security, so then what’s the difference between running the install script from the internet and downloading a binary from the internet and running it?
To add to OP’s concerns, the server can detect if you run curl <URL> | sh rather than just downloading the file, and deliver a malicious payload only in the piped-to-sh case, where no one is viewing it.
You’re already installing a binary from them; the trust in both the authors and the delivery method is already there.
If you don’t trust, then don’t install their binaries.
You aren’t just trusting the authors though. You’re trusting that no other step in the chain has been tampered with or compromised somehow.
See post edit. I’ve already answered that twice.
You are being irrational about this.
You’re absolutely correct that it is bad practice, however, 98% of people already follow bad practice out of convenience. All the points you mentioned against “DoWnlOADing a BinAary” are true, but it’s simply what people do and already don’t care about.
You can offer only your way of installing and people will complain about the inconvenience of it. Especially if there’s another similar project that does offer the more convenient way.
The only thing you can rationally recommend is to not make the install script the “recommended” way, and recommend they download the binaries from the source code page and verify checksums. But most people won’t care and use the install script anyway.
If the install script were “bloody pointless”, it would not exist. Most people don’t know their architecture, the script selects it for them. Most people don’t know what “adding to path” means, this script does it for them. Most people don’t know how to install shell completions, this script does it for them.
You massively overestimate the average competence of software developers and how much they care. Now, a project can try to educate them and lose potential users, or a project can follow user behavior. It’s not entirely wrong to follow user behavior and offer the better alternatives to competent people, which this project does. It explains that it’s possible and how to download the release from the Github page.
Can you actually explain what concerns you have that wouldn’t be any more of a concern if you downloaded and installed a binary directly?
At least a shell script you can read in plaintext, a binary can just do who the fuck knows what.
If they expected you to read the install script, they’d tell you to download and run it. It’s presented here for lazy people in a “trust me, bro, nothing could ever go wrong” form.
- There are SHA256 checksums of each binary file available in each release on Github. You can confirm the binary was not tampered with by comparing a locally computed checksum to the value in the release’s checksums file.
- Binaries can also be signed (not that signing keys have never leaked, but it’s still one step in the chain of trust)
- The install script is not hosted on Github. A misconfigured / compromised server can allow a bad actor to tamper with the install script that gets piped directly into your shell. The domain could also lapse and be re-registered by a bad actor to point to a malicious script. Really, there’s lots of things that can go wrong with that.
I’ve gone through and responded to the other top level comments as well, but another massive issue you could add to your edit is that servers can detect curl <URL> | sh rather than just curl <URL> and deliver a malicious payload only if it’s being piped directly to a shell. There’s a proof-of-concept attack showing its efficacy here: https://github.com/Stijn-K/curlbash_detect
Tbf, every time you’re installing basically anything at all, you’re trusting whoever hosts the stuff not to tamper with it. You’re already putting a lot of faith out there, and I’m sure a lot of the software actually contains crypto-mining malware or something else.
I wouldn’t call anyone who does this a developer. No offense, but it’s a horrible practice that usually comes from hacky projects.
I’ll die on the hill that curl | bash is fine if you’re installing software that self-updates - very common for package managers, as other comments already illustrated.
If you don’t trust the authors, don’t install it (duh).
If you don’t trust the authors, don’t install it (duh).
Just because I trust the authors to write good rust/javascript/etc code, doesn’t mean I trust them to write good bash, especially given how many footguns bash has.
Steam once deleted a user’s home directory.
But: I do agree with you. I think curl | bash is reasonable for package managers like nix or brew. And then once those are installed, it’s better to get software like the bun OP mentions from them, rather than from curl | bash.
What’s a good package manager right now for stuff like this if I don’t want to use the distro package manager, though? I want up-to-date versions of these tools, ideally shipped by the devs themselves, with easy removal and updates. Is there any right now? I think Homebrew is like that? But I wish it didn’t require creating an entire new user and worked on a per-user basis.
In an ideal world, I would want to use these tools in such a way that I can uninstall them, including any tool data (cache, config, etc.), and update them in a reliable manner. Most of these tools are also hellbent on creating a new ".<tool-name>" folder or file in the home folder, ignoring the XDG spec.
Nix. I use it for everything, including all of my tools I use on my work MacBook.
There are many ways to use nix for this stuff, but personally I use home-manager in a flake-based setup. Versions of tools are all pinned in a lockfile which is committed to source control, so it’s easy to get my config and all my tools on a new machine without any breakage (it does require installing first, though).
It’s a great tool and has largely solved the pain of dealing with having to work on MacOS, for me.
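If a full home-manager setup sounds like too much to start with, plain nix profile also covers the “up to date, easy removal” case, assuming the new nix CLI/flakes are enabled and the tool you want is in nixpkgs (bun is just the example here):
$ nix profile install nixpkgs#bun
$ nix profile list                 # see everything installed this way
$ nix profile remove bun           # recent nix versions accept removal by name
Upgrades go through nix profile upgrade; it’s less reproducible than the flake setup described above, but it gets you current versions without touching the distro package manager.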
It says in the comment of the script:
npm install
npm is JS-specific
if I don’t want to use the distro package manager
I’m stunned you don’t understand why this is a problem.
This was absolutely trivial stuff before the great Y2K layoffs, so if you can’t figure it out, ask someone who was releasing software professionally back then.
And please, if you learn something from this, try to help others.
I don’t want to use a distro package manager for certain software because nearly every distro except Arch requires adding third-party repositories, which can stop getting updates at any second.
Don’t worry, I understand the intricacies of these problems a lot more deeply than you probably realise. As a developer, it can suck when your “hotfix” cools down by the time a distro gets around to packaging it. And as a packager, you’re human in the end. As a user though, you just want stuff to work.
As a longtime Linux user, this isn’t really a problem for me, none of this is. But what about a new user? We need to address these issues at some point if we want Linux to be truly user-friendly.