curl https://some-url/ | sh

I see this all over the place nowadays, even in communities that, I would think, should be security conscious. How is that safe? What’s stopping the downloaded script from wiping my home directory? If you use this, how can you feel comfortable?

I understand that we have the same problems with the installed application itself, even if it was downloaded and installed manually. But I feel the bar for making a mistake in a shell script is much lower than in whatever language the main application is written in. Don’t we have something better than “sh” for this? Something with less power to do harm?

  • WolfLink@sh.itjust.works · 4 points · 3 hours ago (edited)

    It isn’t more dangerous than running a binary downloaded from them by any other means. It isn’t more dangerous than the downloaded installer programs that are common on Windows.

    TBH macOS has had the more secure idea here: applications downloaded directly, without any sort of installer, are sandboxed by default. Linux is starting to head in that direction now with things like Flatpak.

  • Artyom@lemm.ee · 20 points · 13 hours ago

    What’s stopping the downloaded script from wiping my home directory?

    What’s stopping any Makefile, build script, or executable from running rm -rf ~? The correct answer is “nothing”. PPAs are similarly open. Things are a little safer if you only use your distro’s default package sources, but it’s always possible that a program will legitimately want to delete something in your home directory, so it always has permission to.

    Containerized apps are the only way around this, where they get their own home directory.
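
    For example, a rough sketch of what that isolation looks like with Docker (the image and script path here are placeholders, not anything from this thread):

        # run an untrusted install script inside a throwaway container so it only
        # ever sees the container's own filesystem and home directory
        docker run --rm -it -v "$PWD/install.sh:/tmp/install.sh:ro" debian:stable sh /tmp/install.sh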

    • easily3667@lemmus.org · 2 points · 5 hours ago (edited)

      Don’t forget your package manager, running someone’s installer as root

      It’s roughly the same state as when Windows Vista rolled out UAC in 2007 and everything still required admin rights, because that’s just how everything worked… but unlike Microsoft, Linux distros never did the work of splitting installs into admin vs. unprivileged-user installers.

    • moonpiedumplings@programming.dev · 1 point · 6 minutes ago

      Docker doesn’t do this anymore. Their install script got moved to “only do this for testing”.

      Use a convenience script. Only recommended for testing and development environments.

      Now, their install page recommends packages/repos first, and then a manual install of the binaries second.
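
      For reference, the convenience-script route their docs describe is the usual two-step download-then-run (quoting from memory, so check the current page):

          curl -fsSL https://get.docker.com -o get-docker.sh
          # read it first; the docs themselves flag this as test/dev only
          sudo sh get-docker.sh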

    • easily3667@lemmus.org · 1 point · 5 hours ago (edited)

      To be fair, that’s because Linux funnels you into the safeguard-free terminal, where it’s much harder to visualize what’s going on and there are fewer checks to make sure you’re doing what you mean to do. I know it’s been a trend for a long time for software devs to think they’re immune from mistakes, but… they aren’t. And neither is anyone else.

  • thomask@lemmy.sdf.org · 12 points · 20 hours ago (edited)

    The security concerns are often overblown. The bigger problem for me is I don’t know what kind of mess it’s going to make or whether I can undo it. If it’s a .deb or even a tarball to extract in /usr/local then I know how to uninstall.

    I will still use them sometimes, but only for things I know and understand - e.g. rustup will put things in ~/.rustup and update the PATH in my shell profile, and because I know that’s what it does, I’m happy to use the automation on a new system.
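
    (For the curious, rustup’s one-liner is the canonical example of the pattern; from memory it’s roughly:

        curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

    and the --proto/--tlsv1.2 flags are there to refuse anything that isn’t modern HTTPS.)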

      • thomask@lemmy.sdf.org · 9 points · 15 hours ago (edited)

        So tell me: if I download and run a bash script over https, or a .deb file over https and then install it, why is the former a “security nightmare” and the latter not?

        • jagged_circle@feddit.nl · 1 point · 4 hours ago

          Both are a security nightmare, if you’re not verifying the signature.

          You should verify the signature of anything you download before running it, be it a bash script, a .deb file, an AppImage, or to-be-compiled source code.

          Best thing is to just use your repo’s package manager. Apt will not run anything that isn’t properly signed by a package team member’s release PGP key.
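
          As a rough sketch of what that looks like for a standalone script, assuming the project publishes a detached signature and you have already imported their release key (both assumptions, not a given):

              curl -fsSLO https://example.com/install.sh
              curl -fsSLO https://example.com/install.sh.asc
              # only run the script if the signature checks out against the imported key
              gpg --verify install.sh.asc install.sh && sh install.sh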

      • billwashere@lemmy.world · 4 points · 21 hours ago

        Yeah, I guess if they were being especially nefarious they could serve two different scripts based on user-agent. But I meant what you said anyway… :) I download and then read through the script. I know this is a common thing and people are wary of doing it, but has anyone ever heard of something disreputable actually turning up in one of these scripts? I personally haven’t yet.
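
        If you’re worried about that particular trick, one crude spot-check is to fetch the script twice with different User-Agent strings and diff the results (example.com is a placeholder here):

            curl -fsSL https://example.com/install.sh -o as-curl.sh
            curl -fsSL -A "Mozilla/5.0" https://example.com/install.sh -o as-browser.sh
            # any difference suggests the server is tailoring what it serves to curl
            diff as-curl.sh as-browser.sh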

        • Possibly linux@lemmy.zip · 1 point · 20 hours ago

          I’ve seen it many times. It usually takes the form of fake websites that are impersonating the real thing. It is easy to manipulate Google results. Also, there have been a few cases where a bad design and a typo result in data loss.

  • ikidd@lemmy.world · 3 points · 20 hours ago

    When I modded some subreddits I had an automod rule that would target curl-bash pipes in comments and posts, and remove them. I took a fair bit of heat over that, but I wasn’t backing down.

    I had a lot of respect for Tteck and had a couple of discussions with him about why I was doing that. Eventually he put up a notice that pretty much said what I did: understand what a script does, and remember that the URL you use can be pointed at something else entirely long after the command line is posted.

  • zygo_histo_morpheus@programming.dev · 82 points · 2 days ago

    You have the option of piping it into a file instead, inspecting that file for yourself and then running it, or running it in some sandboxed environment. Ultimately though, if you are downloading software over the internet you have to place a certain amount of trust in the person you’re downloading the software from. Even if you’re absolutely sure that the download script doesn’t wipe your home directory, you’re going to have to run the program at some point, and it could just as easily wipe your home directory then instead.
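
    Concretely, the inspect-first version looks something like this (the URL and filename are placeholders), and tools like firejail or a throwaway container cover the sandbox option:

        curl -fsSL https://some-url/install.sh -o install.sh
        less install.sh      # skim for what it downloads, where it writes, and any sudo/rm calls
        sh install.sh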

    • cschreib@programming.dev (OP) · 7 points · 1 day ago

      Indeed, looking at the content of the script before running it is what I do if there is no alternative. But some of these scripts are awfully complex, and manually parsing the odd bash stuff is a pain when all I want to know is: 1) what URL are you downloading stuff from? 2) where are you going to install the stuff?
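
      One quick-and-dirty triage (no substitute for actually reading it, and assuming you’ve already saved it as install.sh) is to grep the script for exactly those two things:

          grep -nE 'https?://' install.sh                      # 1) what is it downloading, and from where
          grep -nE '/usr/local|/opt|\.local|HOME' install.sh   # 2) where is it going to put things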

      As for running the program, I would trust it more than a random deployment script. People usually place more emphasis on testing the former, not so much the latter.

  • lemmeBe@sh.itjust.works · 23 points · 1 day ago

    I think a safer approach is to:

    1. Download the script first, review its contents, and then execute it.
    2. Ensure the URL uses HTTPS to reduce the risk of man-in-the-middle attacks (curl can enforce this, as sketched below).
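
    For step 2, curl itself can be told to refuse anything that isn’t HTTPS over a modern TLS version, for example (the URL is a placeholder):

        curl --proto '=https' --tlsv1.2 -fsSL https://example.com/install.sh -o install.sh
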
  • knexcar@lemmy.world · 2 points · 20 hours ago

    What does curl even do? Unstraighten? Seems like any other command I’d blindly paste from an internet thread into a terminal window to try to get something on Linux to work.

    • irelephant [he/him]🍭@lemm.ee · 2 points · 12 hours ago

      curl sends HTTP requests; curl lemmy.world would return the HTML of lemmy.world’s homepage. Piping it into bash means you are fetching a shell script and running it in one step.
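
      In other words, these two make the same kind of request; the only difference is what happens to the bytes that come back:

          curl https://lemmy.world/        # prints the homepage HTML to your terminal
          curl https://some-url/ | sh      # hands whatever comes back straight to the shell to execute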

      • easily3667@lemmus.org · 1 point · 5 hours ago (edited)

        I think he knows but is commenting on the pathetic state of security culture on Linux. (“Linux is secure so I can do anything without concerns”)

        • What URLs is it not a client for? As far as I understand, it will pull whatever data is presented by whatever URL. cURL doesn’t really care about the protocol being HTTP; you can use it with FTP as well, and I haven’t tested it yet, but now that I’m curious I want to see if it works for SMB.
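
          From memory, curl speaks quite a few protocols besides HTTP(S) - FTP, SFTP, SCP and SMB among them - though it depends on how your particular build was compiled:

              curl -V                                                         # the "Protocols:" line lists what this build supports
              curl ftp://ftp.example.com/pub/file.txt -o file.txt
              curl -u 'user:pass' smb://server/share/file.txt -o file.txt    # only if built with SMB support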