
curl https://some-url | sh

I see this all over the place nowadays, even in communities that, I would think, should be security conscious. How is that safe? What's stopping the downloaded script from wiping my home directory? If you use this, how can you feel comfortable?

I understand that we have the same problems with the installed application, even if it was downloaded and installed manually. But I feel the bar for making a mistake in a shell script is much lower than in whatever language the main application is written. Don't we have something better than "sh" for this? Something with less power to do harm?

50 comments
[-] billwashere@lemmy.world 19 points 1 day ago
[-] possiblylinux127@lemmy.zip 7 points 1 day ago

Download it and then read it. Curl has a different user agent than web browsers.
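That workflow in concrete terms, with a local file standing in for the remote script so the sketch is self-contained (the URL and install path are made up):

```shell
# Simulated download-then-read workflow. A real fetch would be something like:
#   curl -fsSL https://example.com/install.sh -o install.sh
cat > install.sh <<'EOF'
#!/bin/sh
echo "would install to $HOME/.local/hypothetical-tool"
EOF

# Inspect the script before running it. Since a server can key off the user
# agent, you could also refetch with a browser UA (curl -A 'Mozilla/5.0 ...')
# and diff the two copies.
cat install.sh

# Only after reading it:
sh install.sh
```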

[-] billwashere@lemmy.world 4 points 1 day ago

Yeah, I guess if they were being especially nefarious they could supply two different scripts based on user agent. But I meant what you said anyway… :) I download and then read through the script. I know this is a common thing and people are wary of doing it, but has anyone ever heard of there actually being something disreputable in one of these scripts? I personally haven’t yet.

[-] possiblylinux127@lemmy.zip 2 points 1 day ago

I've seen it many times. It usually takes the form of fake websites impersonating the real thing; it's easy to manipulate Google results. There have also been a few cases where bad design plus a typo resulted in data loss.

[-] ikidd@lemmy.world 4 points 23 hours ago

When I modded some subreddits I had an automod rule that would target curl-bash pipes in comments and posts, and remove them. I took a fair bit of heat over that, but I wasn't backing down.

I had a lot of respect for Tteck and had a couple of discussions with him about that and why I was doing it. Eventually he put up a notice that said pretty much what I did: understand what a script does, and remember that the URL you use can be pointed at something else entirely long after the command line is posted.

You have the option of piping it into a file instead, inspecting that file for yourself and then running it, or running it in some sandboxed environment. Ultimately though, if you are downloading software over the internet you have to place a certain amount of trust in the person you're downloading the software from. Even if you're absolutely sure that the download script doesn't wipe your home directory, you're going to have to run the program at some point, and it could just as easily wipe your home directory then instead.
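A cheap version of the sandbox idea is to hand the script a throwaway HOME and a stripped-down environment, so a buggy `rm -rf "$HOME"` can only nuke scratch space. To be clear, this is damage limitation, not a security boundary (the script still runs as your user; a container or VM is much stronger). The script here is a hypothetical stand-in:

```shell
# Run an untrusted script with a throwaway HOME and a minimal environment.
# NOT a real sandbox: the script still runs with your user's permissions.
mkdir -p /tmp/fakehome
printf 'echo "installing under $HOME"\n' > /tmp/fakehome/install.sh
env -i HOME=/tmp/fakehome PATH=/usr/bin:/bin sh /tmp/fakehome/install.sh
```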

[-] HelloRoot@lemy.lol 29 points 1 day ago

All the software I have is downloaded from the internet...

[-] strongarm@lemmy.dbzer0.com 1 points 18 hours ago

Steady on Buck Rogers, what is this, 2025!?

[-] WhatAmLemmy@lemmy.world 26 points 1 day ago* (last edited 1 day ago)

You should try downloading the software from your mind brain, like us elite hackers do it. Just dump the binary from memory into a txt file and exe that shit, playa!

You should start getting it from CD-ROMs, that shit you can trust

[-] veroxii@aussie.zone 25 points 1 day ago

I got my software from these free USB sticks I found in the parking lot.

[-] FauxLiving@lemmy.world 7 points 1 day ago

Ah, you're one of my users

[-] Ephera@lemmy.ml 4 points 1 day ago

It is kind of cool when you've actually written your own software and use that. But realistically, I'm still getting the compiler from the internet...

[-] cschreib@programming.dev 7 points 1 day ago

Indeed, looking at the content of the script before running it is what I do if there is no alternative. But some of these scripts are awfully complex, and manually parsing the odd bash stuff is a pain, when all I want to know is: 1) what URL are you downloading stuff from? 2) where are you going to install the stuff?
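For those two questions you can often get away with grepping the script instead of fully parsing it. A rough sketch, run against a made-up install script (the heuristics will miss URLs or paths assembled at runtime):

```shell
# Fake install script to inspect
cat > install.sh <<'EOF'
#!/bin/sh
DOWNLOAD_URL="https://releases.example.com/tool-v1.2.tar.gz"
INSTALL_DIR="$HOME/.local/bin"
curl -fsSL "$DOWNLOAD_URL" | tar -xz -C "$INSTALL_DIR"
EOF

# 1) what URLs does it touch? (-o prints only the match, -n the line number)
grep -Eon 'https?://[^" ]+' install.sh

# 2) where does it write? (crude heuristic on common variable names and paths)
grep -En 'INSTALL|PREFIX|/usr/local|\.local' install.sh
```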

As for running the program, I would trust it more than a random deployment script. People usually place more emphasis on testing the former, not so much the latter.

load more comments (10 replies)
[-] lemmeBe@sh.itjust.works 24 points 1 day ago

I think a safer approach is to:

  1. Download the script first, review its contents, and then execute it.
  2. Ensure the URL uses HTTPS to reduce the risk of man-in-the-middle attacks.
[-] jagged_circle@feddit.nl 3 points 20 hours ago

Key word being reduce. HTTPS doesn't protect against loads of attacks. Best to verify the sig.

If it's not signed, open a bug report.
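Proper signature verification needs the publisher's key (roughly `gpg --verify install.sh.sig install.sh`), but even a checksum fetched over a second channel catches corrupted or tampered downloads. A minimal sketch, with local files standing in for the published script and its checksum:

```shell
# Publisher side: ship a checksum alongside the script
printf 'echo hello from installer\n' > install.sh
sha256sum install.sh > install.sh.sha256

# User side: verify before running; this fails loudly if even one byte changed
sha256sum -c install.sh.sha256
```

Note that a checksum hosted on the same server as the script only protects against transfer corruption, since an attacker who can swap the script can swap the checksum too; that's what signatures are for.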

[-] possiblylinux127@lemmy.zip 5 points 1 day ago

Install scripts are bad in general. Ideally, use officially packaged software.

load more comments (10 replies)
[-] stevedice@sh.itjust.works 14 points 1 day ago

If you've downloaded and audited the script, there's no reason to pipe it from curl to sh, just run it. No https necessary.

load more comments (4 replies)
load more comments (2 replies)
[-] knexcar@lemmy.world 3 points 23 hours ago

What does curl even do? Unstraighten? Seems like any other command I’d blindly paste from an internet thread into a terminal window to try to get something on Linux to work.

load more comments (6 replies)
[-] serenissi@lemmy.world 15 points 1 day ago

Unpopular opinion: these are handy for quickly installing into a new VM or container (usually throwaway), where one doesn't have to think much unless the script breaks. People don't install things on a host or production machine multiple times, so anything installed there is usually vetted, and most of the time from trusted sources like distro repos.

For a normal threat model, it is not much different from downloading a compiled binary from somewhere other than a well-trusted repo. The Windows software ecosystem is famously infamous for exactly this, but it sticks around still.

[-] jagged_circle@feddit.nl 1 points 20 hours ago

Yeah and windows is famous for botnets lol.

load more comments (3 replies)
[-] Undaunted@feddit.org 20 points 1 day ago

You shouldn't install software from someone you don't trust anyway, because even if the installation process is safe, the software itself can do whatever it has permission to.

"So if you trust their software, why not their install script?" you might ask. Well, it is detectable on the server side whether you download the script or pipe it into a shell. So even if the vendor is trustworthy, there could be a malicious middleman that serves you the original, harmless script when you download it, and a malicious one when you pipe it into your shell.

And I think this is not obvious and very scary.

[-] August27th@lemmy.ca 8 points 1 day ago

it is detectable [...] server side, if you download the script [vs] pipe it into a shell

I presume you mean if you download the script in a browser, vs using curl to retrieve it, where presumably you are piping it to a shell. Because yeah, the user agent is going to reveal which tool downloaded it, of course. You can use curl to simply retrieve the file without executing it though.

Or are you suggesting that curl does something different in its request to the server, depending on whether it is saving the file to disk vs streaming it to a pipe?

[-] Undaunted@feddit.org 11 points 1 day ago

It is actually a passive detection based on the timing of the chunk requests. By default, curl only requests new chunks once the buffer has been drained by the shell executing the commands received so far. This can be used to detect that someone is not merely downloading the script but simultaneously executing it. Here's a writeup about it:

https://web.archive.org/web/20250209133823/https://www.idontplaydarts.com/2016/04/detecting-curl-pipe-bash-server-side/

You can also find some proof-of-concept implementations online to try it out yourself.
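The property that makes this possible is easy to see locally: `sh` executes a piped script as it streams in, so early lines run while later chunks are still "in flight" (and, in the malicious case, could still be chosen by the server). A one-liner demo, no server needed:

```shell
# Early commands run before the rest of the "script" has even arrived:
# part one executes roughly a second before part two is written to the pipe.
{ printf 'echo part one already ran\n'; sleep 1; printf 'echo part two arrived later\n'; } | sh
```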

load more comments (2 replies)
load more comments (2 replies)
[-] MangoPenguin@lemmy.blahaj.zone 17 points 1 day ago* (last edited 1 day ago)

It's not much different from downloading and compiling source code, in terms of risk. A typo in the code could easily wipe home or something like that.

Obviously the package manager repo for your distro is the best option because there's another layer of checking (in theory), but very often things aren't in the repos.

The solution really is just backups and snapshots, there are a million ways to lose files or corrupt them.

load more comments (4 replies)
[-] syklemil@discuss.tchncs.de 27 points 1 day ago

This is simpler than the download, ./configure, make, make install steps we had some decades ago, but not all that different in that you wind up with arbitrary, unmanaged stuff.

Preferably use the distro native packages, or else their build system if it's easily available (e.g. AUR in Arch)

[-] Lemmchen@feddit.org 21 points 1 day ago
load more comments (1 replies)
[-] onlinepersona@programming.dev 11 points 1 day ago

Those just don't get installed. I refuse to install stuff that way. It's too reminiscent of installing stuff on Windows. "Pssst, hey bud, want to run this totally safe executable on your PC? It won't do anything bad. Pinky promise." Ain't happening.

The only exception I make is for nix on non-NixOS machines, because that bootstraps everything and I've read that script a few times.

Anti Commercial-AI license

[-] 30p87@feddit.org 15 points 1 day ago

Well yeah ... the native package manager. Has the bonus of the installed files being tracked.

load more comments (3 replies)
[-] Nomecks@lemmy.ca 7 points 1 day ago

curl | sudo bash Gang

load more comments
this post was submitted on 13 Mar 2025
260 points (96.4% liked)

Linux

A community for everything relating to the GNU/Linux operating system