• CameronDev@programming.dev
    link
    fedilink
    arrow-up
    197
    ·
    7 months ago

    To be fair, we only know of this one. There may well be other open source backdoors floating around, still undetected. Was Heartbleed really an accident?

    • lemmyreader@lemmy.mlOP
      English
      arrow-up
      101
      ·
      7 months ago

      True. And “given enough eyeballs, all bugs are shallow” is a neat-sounding idea from a time when there were far fewer lines of code than there are now. Sometimes it is scary to see how long a vulnerability in the Linux kernel had been sitting there, “waiting” to be exploited.

      • RecluseRamble@lemmy.dbzer0.com
        arrow-up
        75
        ·
        7 months ago

        Still far better than a proprietary kernel made by a tech corp, carried barely changed from release to release, maintained by even fewer people, who might well be adding a backdoor themselves for their friends at government agencies.

    • xenoclast@lemmy.world
      arrow-up
      36
      ·
      7 months ago

      Yeah, he didn’t find the right unmaintained project. There are many, many CS undergrads starting projects that will become unmaintained pretty soon.

  • Codex@lemmy.world
    arrow-up
    170
    ·
    7 months ago

    I’ve gotten back into tinkering on a little Rust game project; it has about a dozen dependencies on various math and gamedev libraries. When I go to build (just like with npm in my JavaScript projects), cargo needs to download and build just over 200 packages. 3 of them build and run “install scripts”, which are themselves just Rust programs. I know this because my anti-virus flagged each of them and I had to allow them through so my little roguelike would build.
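    (For context: those “install scripts” are cargo build scripts, i.e. build.rs files: ordinary Rust programs that cargo compiles and runs on your machine before building the crate, with your full user privileges, which is why antivirus software takes an interest. A minimal, hypothetical sketch:)

    ```rust
    // build.rs: an ordinary Rust program cargo runs before compiling the crate.
    // It can touch the network and filesystem or spawn processes like any program.
    use std::env;

    // OUT_DIR is set by cargo during a real build; fall back for standalone runs.
    fn out_dir() -> String {
        env::var("OUT_DIR").unwrap_or_else(|_| String::from("target/out"))
    }

    fn main() {
        // Typical legitimate uses: re-run tracking and code generation.
        println!("cargo:rerun-if-changed=build.rs");
        println!("cargo:warning=generating code into {}", out_dir());
    }
    ```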

    Like, what are we even supposed to tell “normal people” about security? “Yeah, don’t download files from people you don’t trust and never run executables from the web. How do I install this programming utility? Blindly run code from over 300 people and hope none of them wanted to sneak something malicious in there.”

    I don’t want to go back to the days of hand-chiseling every routine into bare silicon, but I feel like there must be a better system we just haven’t devised yet.

    • Killing_Spark@feddit.de
      arrow-up
      31
      ·
      7 months ago

      Debian actually started to collect and maintain packages of the most important Rust crates. You can use those as a source for cargo.
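      For example, cargo can be pointed at those packaged sources via source replacement in .cargo/config.toml (the directory below is the Debian convention; adjust for your system):

      ```toml
      # .cargo/config.toml: replace crates.io with Debian's packaged crate sources
      [source.crates-io]
      replace-with = "debian"

      [source.debian]
      directory = "/usr/share/cargo/registry"
      ```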

    • wolf@lemmy.zip
      English
      arrow-up
      23
      ·
      7 months ago

      THIS.

      I do not get why people don’t learn from Node/NPM: if your language has no exhaustive standard library, the community ends up reinventing the wheel, and each real-world program has hundreds (or thousands) of dependencies.

      Instead of throwing new features at Rust, the maintainers should focus on growing a trusted standard library and improving tooling, but that is less fun, I assume.

        • wolf@lemmy.zip
          English
          arrow-up
          3
          ·
          7 months ago

          Easily, just look at the standard libraries of Java/Python and Golang! :-P

          To get one thing out of the way: every standard library has dark corners with bad APIs and outdated modules. IMHO it is a tradeoff, and in my experience even a bad standard library works better than everyone reinventing their own small module. If you want to compare it to human languages: having no standard library is like agreeing on English grammar while everyone mostly makes up their own words, which makes communication challenging.

          My examples of missing items from the Rust standard library (correct me if I am wrong; I’m not a Rust user, for many reasons):

          • Cross-platform GUI library (see Swing/Tk)
          • Enough bits to create a server
          • Full set of data structures and algorithms
          • Full set of serialization format processing for XML/JSON/YAML/CSV/INI files
          • HTTP(S) server for production with support for letsencrypt etc.

          Things I don’t know about if they are provided by a Rust standard library:

          • Go like communication channels
          • High-level parallelism constructs (like Tokio etc.)
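          On the channels point: Rust’s standard library does ship Go-style channels as std::sync::mpsc (multi-producer, single-consumer). A minimal sketch:

          ```rust
          use std::sync::mpsc;
          use std::thread;

          // Spawn a producer thread and collect everything it sends.
          fn collect_from_worker() -> Vec<i32> {
              let (tx, rx) = mpsc::channel();
              let handle = thread::spawn(move || {
                  for i in 0..3 {
                      tx.send(i).expect("receiver still alive");
                  }
                  // `tx` is dropped here, which closes the channel.
              });
              let received: Vec<i32> = rx.iter().collect();
              handle.join().unwrap();
              received
          }

          fn main() {
              assert_eq!(collect_from_worker(), vec![0, 1, 2]);
          }
          ```

          Higher-level constructs (async runtimes, work stealing) do live in external crates like Tokio and Rayon, though.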

          My point is to provide good-enough defaults in a standard library that everybody knows and that is well documented and taught. If someone has special needs, they can always come up with a library. Further, if something in the standard library becomes obsolete, it can easily be deprecated.

          • areyouevenreal@lemm.ee
            arrow-up
            3
            ·
            7 months ago

            Python doesn’t have a production web server in its standard library. Neither does Java. Those are external programs or libraries. C# is the only language I know that comes with an official production grade server, and that’s still a separate package (IIS).

            Rust has a set of recommended data structures in its standard library too: https://doc.rust-lang.org/std/collections/index.html
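            A quick sketch of what std already covers without third-party crates (hash maps, ordered maps, sets, deques):

            ```rust
            use std::collections::{BTreeMap, HashMap};

            // Count word occurrences with the std HashMap entry API.
            fn word_counts<'a>(words: &[&'a str]) -> HashMap<&'a str, u32> {
                let mut counts = HashMap::new();
                for w in words {
                    *counts.entry(*w).or_insert(0) += 1;
                }
                counts
            }

            fn main() {
                let counts = word_counts(&["xz", "ssh", "xz"]);
                assert_eq!(counts["xz"], 2);

                // Ordered map, also in std: iteration is sorted by key.
                let ordered: BTreeMap<u32, &str> = [(2, "b"), (1, "a")].into();
                assert_eq!(ordered.keys().copied().collect::<Vec<_>>(), vec![1, 2]);
            }
            ```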

            I don’t know what algorithms you are looking for, so I can’t answer that.

            The rest I don’t think are included in Rust. Then again, they aren’t included in most languages’ standard libraries either.

            • wolf@lemmy.zip
              English
              arrow-up
              3
              ·
              7 months ago

              Golang’s web server is production grade and used in production. (Of course everyone uses some high-performance proxy like NGINX for serving static pages, but that’s another story.)

              Technically you are right that Java has no production web server, which I don’t like; OTOH Java has standard web server APIs, and Spring is the de facto standard for web applications. (I totally would not mind moving Spring into the OpenJDK.)

              My point is simple: instead of having Rust edition 2018, 2021, etc. and tweaking the syntax ad infinitum, I’d rather have a community that invests in a good/broad standard library and good tooling.

              The only platform widely used in production without a big standard library is Node.js/JavaScript, mostly for historical reasons; look at the problems Node.js has had for a decade now because of that missing standard library.

        • Miaou@jlai.lu
          arrow-up
          3
          ·
          7 months ago

          It does, but the person you’re replying to apparently expects a standard library to contain an ECS and a rendering engine.

    • RegalPotoo@lemmy.world
      English
      arrow-up
      19
      ·
      edit-2
      7 months ago

      It’s a really wicked problem to be sure. There is work underway in a bunch of places around different approaches to this; take a look at SBoM (software bill-of-materials) and reproducible builds. Doesn’t totally address the trust issue (the malicious xz releases had good gpg signatures from a trusted contributor), but makes it easier to spot binary tampering.
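      Reproducible builds make that kind of tampering detectable: anyone can rebuild from source and compare digests with the shipped artifact. The core check is just a hash comparison; a toy sketch (stand-in files, not a full attestation workflow):

      ```shell
      # Two independent builds of the same source should be bit-identical,
      # and therefore hash to the same digest.
      printf 'pretend-release-tarball' > build-a.tar
      printf 'pretend-release-tarball' > build-b.tar

      a=$(sha256sum build-a.tar | cut -d' ' -f1)
      b=$(sha256sum build-b.tar | cut -d' ' -f1)

      if [ "$a" = "$b" ]; then
          echo "digests match: build looks reproducible"
      else
          echo "MISMATCH: nondeterminism or tampering" >&2
      fi
      ```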

      • wizzim@infosec.pub
        arrow-up
        12
        ·
        edit-2
        7 months ago

        +1

        Shameless plug for the OSS Review Toolkit project (https://oss-review-toolkit.org/ort/), which analyzes your package manager metadata, builds a dependency tree, and generates an SBOM for you. It can also check for vulnerabilities with the help of VulnerableCode.

        It is mainly aimed at OSS Compliance though.

        (I am a contributor)

    • acockworkorange@mander.xyz
      arrow-up
      18
      ·
      7 months ago

      Do you really need to download new versions at every build? I thought it was common practice to use the oldest safe version of a dependency that offers the functionality you want. That way your project can run on less up-to-date systems.
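      For what it’s worth, cargo’s Cargo.lock already freezes exact versions between builds; you can additionally pin requirements in Cargo.toml (crate names below are made up for illustration):

      ```toml
      [dependencies]
      # Caret (default): "1.2.3" means >=1.2.3, <2.0.0; newest compatible wins.
      some-math-crate = "1.2.3"
      # Exact pin: never silently pick up a newer release.
      some-gamedev-crate = "=0.9.1"
      ```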

      • baseless_discourse@mander.xyz
        arrow-up
        38
        ·
        edit-2
        7 months ago

        Most software does not include detailed security fixes in its changelog for people to check; and many of these security fixes are in dependencies, so they are unlikely to be documented by the software visible to the end user.

        So most of the time, the safest “oldest safe” version is just the latest version.

        • acockworkorange@mander.xyz
          arrow-up
          2
          ·
          edit-2
          7 months ago

          So only projects like Debian do security backports?

          Edit: why the downvote? Is this not something upstream developers do? Security fixes on older releases?

          • Kelly@lemmy.world
            English
            arrow-up
            2
            ·
            7 months ago

            Backports for supported versions, sure.

            That’s why there is an incentive to limit support to the latest and maybe one previous release; it saves on the backporting burden.

      • treadful@lemmy.zip
        English
        arrow-up
        24
        ·
        7 months ago

        Okay, but are you still going to audit 200 individual dependencies even once?

    • trolololol@lemmy.world
      arrow-up
      2
      ·
      7 months ago

      I’m not familiar with Rust, but at least for Java there’s an OWASP plugin that tells you if you’re using an unsafe library.
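      That’s presumably OWASP Dependency-Check; a minimal Maven configuration looks something like this (version number illustrative):

      ```xml
      <!-- pom.xml: flag dependencies with known CVEs during the build -->
      <plugin>
        <groupId>org.owasp</groupId>
        <artifactId>dependency-check-maven</artifactId>
        <version>9.0.9</version>
        <executions>
          <execution>
            <goals>
              <goal>check</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      ```

      Rust’s rough equivalent is cargo-audit, which checks Cargo.lock against the RustSec advisory database.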

  • KillingTimeItself@lemmy.dbzer0.com
    English
    arrow-up
    111
    ·
    7 months ago

    Every time this happens I become inexplicably happy.

    There’s just something about a community doing its fucking job that gets me feeling so normal.

  • hash0772@sh.itjust.works
    arrow-up
    97
    ·
    7 months ago

    After doing all that, getting noticed because of a 300ms delay at startup, by a person who is not a security researcher or even a programmer, would honestly be depressing.

  • mariusafa@lemmy.sdf.org
    arrow-up
    92
    ·
    7 months ago

    I love the free software community. This is one of the things free software was created for. The community defends its users.

    • tired_n_bored@lemmy.world
      arrow-up
      3
      ·
      7 months ago

      I second this. I love feeling part of a community, even though I could never have found the backdoor, let alone fixed it.

  • Cosmic Cleric@lemmy.world
    arrow-up
    60
    ·
    edit-2
    7 months ago

    The problem I have with this meme post is that it gives a false sense of security, when it should not.

    Open or closed source, human beings have to be very diligent and truly spend the time reviewing others code, even when their project leads are pressuring them to work faster and cut corners.

    This situation was a textbook example of how that does not always happen. Granted, duplicity was involved, but still.

    • GamingChairModel@lemmy.world
      arrow-up
      11
      ·
      edit-2
      7 months ago

      100%.

      In many ways, distributed open source software gives more social attack surfaces, because the system itself is designed to be distributed where a lot of people each handle a different responsibility. Almost every open source license includes an explicit disclaimer of a warranty, with some language that says something like this:

      THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.

      Well, bring together enough dependencies, and you’ll see that certain widely distributed software packages depend on the trust of dozens, if not hundreds, of independent maintainers.

      This particular xz vulnerability seems to have affected systemd and sshd, using what was a socially engineered attack on a weak point in the entire dependency chain. And this particular type of social engineering (maintainer burnout, looking for a volunteer to take over) seems to fit more directly into open source culture than closed source/corporate development culture.

      In the closed source world, there might be fewer places to probe for a weak link (socially or technically), which makes certain types of attacks more difficult. In other words, it might truly be the case that closed source software is less vulnerable to certain types of attacks, even if detection/audit/mitigation of those types of attacks is harder for closed source.

      It’s a tradeoff, not a free lunch. I still generally trust open source stuff more, but let’s not pretend it’s literally better in every way.

      • Cosmic Cleric@lemmy.world
        arrow-up
        3
        ·
        edit-2
        7 months ago

        It’s a tradeoff, not a free lunch. I still generally trust open source stuff more, but let’s not pretend it’s literally better in every way.

        Totally agree.

        All the pushback I’m getting is from people who seem to be worried about open source somehow losing a positive talking point when comparing it to closed source systems, which is not my intention (the loss of the talking point). (I personally use Fedora/KDE.)

        But sticking our heads in the sand doesn’t help; when issues arise, we should acknowledge and correct them.

        using what was a socially engineered attack on a weak point in the entire dependency chain.

        An example of what you may be speaking about, indirectly. We can only hope that maintainers do due diligence, but it is volunteer work.

      • 5C5C5C@programming.dev
        arrow-up
        3
        ·
        edit-2
        7 months ago

        There are two big problems with the point that you’re trying to make:

        1. There are many open source projects being run by organizations with as much (often stronger) governance over commit access as a private corporation would have over its closed source code base. The most widely used projects tend to fall under this category, like Linux, React, Angular, Go, JavaScript, and innumerable others. Governance models for a project are a very reasonable thing to consider when deciding whether to use a dependency for your application or library. There’s a fair argument to be made that the governance model of this xz project should have been flagged sooner, and hopefully this incident will help stir broader awareness for that. But unlike a closed source code base, you can actually know the governance model and commit access model of open source software. When it comes to closed source software you don’t know anything about the company’s hiring practices, background checks, what access they might provide to outsourced agents from other countries who may be compromised, etc.

        2. You’re assuming that 100% of the source code used in a closed source project was developed by that company and according to the company’s governance model, which you assume is a good one. In reality BSD/MIT licensed (and illegally GPL licensed) open source software is being shoved into closed source code bases all the time. The difference with closed source software is that you have no way of knowing that this is the case. For all you know some intern already shoved a compromised xz into some closed source software that you’re using, and since that intern is gone now it will be years before anyone in the company notices that their software has a well known backdoor sitting in it.

        • GamingChairModel@lemmy.world
          arrow-up
          1
          ·
          7 months ago

          None of what I’m saying is unique to the mechanics of open source. It’s just that the open source ecosystem as it currently exists today has different attack surfaces than a closed source ecosystem.

          Governance models for a project are a very reasonable thing to consider when deciding whether to use a dependency for your application or library.

          At a certain point, though, that’s outsourced to trust whoever someone else trusts. When I trust a specific distro (because I’m certainly not rolling my own distro), I’m trusting how they maintain their repos, as well as which packages they include by default. Then, each of those packages has dependencies, which in turn have dependencies. The nature of this kind of trust is that we select people one or two levels deep, and assume that they have vetted the dependencies another one or two levels, all the way down. XZ did something malicious with systemd, which opened a vulnerability in sshd, as compiled for certain distros.

          You’re assuming that 100% of the source code used in a closed source project was developed by that company and according to the company’s governance model, which you assume is a good one.

          Not at all. I’m very aware that some prior hacks by very sophisticated, probably state sponsored attackers have abused the chain of trust in proprietary software dependencies. Stuxnet relied on stolen private keys trusted by Windows for signing hardware drivers. The Solarwinds hack relied on compromising plugins trusted by Microsoft 365.

          But my broader point is that there are simply more independent actors in the open source ecosystem. If a vulnerability takes the form of the weakest link, where compromising any one of the many independent links is enough to gain access, that broadly distributed ecosystem is more vulnerable. If a vulnerability requires chaining different things together so that multiple parts of the ecosystem are compromised, then distributing decisionmaking makes the ecosystem more robust. That’s the tradeoff I’m describing, and making things spread too thin introduces the type of vulnerability that I’m describing.

        • GamingChairModel@lemmy.world
          arrow-up
          3
          ·
          7 months ago

          In the broader context of that thread, I’m inclined to agree with you: The circumstances by which this particular vulnerability was discovered shows that it took a decent amount of luck to catch it, and one can easily imagine a set of circumstances where this vulnerability would’ve slipped by the formal review processes that are applied to updates in these types of packages. And while it would be nice if the billion-dollar-companies that rely on certain packages would provide financial support for the open source projects they use, the question remains on how we should handle it when those corporations don’t. Do we front it ourselves, or just live with the knowledge that our security posture isn’t optimized for safety, because nobody will pay for that improvement?

  • DingoBilly@lemmy.world
    arrow-up
    46
    ·
    7 months ago

    Immediately noticed even though the packages have been out for over a month?

    Easily could have stolen a ton of information in that month.

    • mlg@lemmy.world
      English
      arrow-up
      42
      ·
      7 months ago

      Yeah but tbf it was deployed on mostly rolling release and beta releases.

      No enterprise on prod is worried because they’re still on RHEL 6 /s

        • vrighter@discuss.tchncs.de
          arrow-up
          8
          ·
          7 months ago

          We’ve skipped 7 and are jumping straight to 8. The process has been going on for two years now. 9 was released two years ago.

        • mlg@lemmy.world
          English
          arrow-up
          3
          ·
          7 months ago

          My innocent home lab bum thought 4 years would be enough to assume people got off an EOL’d distro lol

      • DingoBilly@lemmy.world
        arrow-up
        12
        ·
        7 months ago

        Yeah, they got lucky. But it shows how susceptible systems are. Really makes you wonder how many systems are infected with something similar; this wouldn’t be the first backdoor that’s live in Linux systems.

      • melpomenesclevage@lemm.ee
        arrow-up
        2
        ·
        edit-2
        7 months ago

        I feel like that’s really crappy non-vegan mental gymnastics. I think veganism is morally superior, but I really want to show mine off, just because I’m offended by how stupid all these are; the fact I know they’re real makes me more ashamed of eating that yogurt earlier than any amount of chattel slavery or butchery ever will.

      • JustEnoughDucks@feddit.nl
        arrow-up
        4
        ·
        7 months ago

        Depends; for example, Debian’s unattended-upgrades caused system restarts after many updates, which was extremely inconvenient for me because I have a more manual bring-up process. I had restarts turned off in its settings and it still restarted.

        I uninstalled it and have not one single unwanted restart since then, so manual upgrades it is.

        • jabjoe@feddit.uk
          English
          arrow-up
          4
          ·
          7 months ago

          I’ve been using it for 10+ years on servers and it’s not been an issue for me.

    • Zozano@lemy.lol
      arrow-up
      17
      ·
      edit-2
      7 months ago

      For the uninitiated, this is a representation of survivorship bias.

      Essentially, the red dots represent bullet holes from aircraft which returned from battle.

      If you were to ask which places should be reinforced with armour, someone with survivorship bias would say “where the red dots are”, whereas people who know anything about engineering would say “everywhere else!”

      It’s like saying: “why are you wearing a helmet? I’ve met hundreds of soldiers and none of them have ever been shot in the head, helmets are a waste of good armour.”

      A true fact: did you know wearing a helmet increases your chances of dying of cancer?

      • grrgyle@slrpnk.net
        arrow-up
        2
        ·
        7 months ago

        A true fact: Did you know wearing a helmet increases your chances of dying of cancer.

        Rofl I love this. Great comment

    • communism@lemmy.ml
      arrow-up
      2
      ·
      7 months ago

      What are you saying? That there are people doing the top version (“I want a backdoor / I ask the corpo to grant me access”) for FOSS but they’re less likely to get caught if they don’t do all the gymnastics?

      • sbv@sh.itjust.works
        English
        arrow-up
        5
        ·
        7 months ago

        OP is referring to a backdoor that was found. It apparently modified behaviour in a way that was noticeable to humans, suggesting that it was built by an unskilled adversary.

        It’s a safe bet that there are others (in FOSS) that remain undiscovered. We know that skilled adversaries can produce pretty amazing attacks (e.g. stuxnet), so it seems likely that similar vulnerabilities remain in other FOSS packages.

        • communism@lemmy.ml
          arrow-up
          1
          ·
          edit-2
          7 months ago

          It’s a safe bet that there are others (in FOSS) that remain undiscovered.

          I agree, but I don’t think that image (about survivorship bias) applies to the OP meme then, as that would imply that it only seems like open source backdoors are convoluted because we’ve not found the simple/obvious ones.

          • sbv@sh.itjust.works
            English
            arrow-up
            1
            ·
            7 months ago

            Survivorship bias or survival bias is the logical error of concentrating on entities that passed a selection process while overlooking those that did not. This can lead to incorrect conclusions because of incomplete data.

            In this case, the selection process is discovering human-evident back doors. It fits by my reading.

    • sus@programming.dev
      arrow-up
      18
      ·
      7 months ago

      because AbstractTransactionAwarePersistenceManagerFactoryProxyBean needs to spin up 32 electron instances (one for each thread) to ensure scalability and robustness and then WelcomeSolutionStrategyExecutor needs to parse 300 megabytes of javascript to facilitate rendering the “welcome” screen

  • Square Singer@feddit.de
    arrow-up
    17
    ·
    7 months ago

    The only real downside on the open source side is that the fix is also public, and with it the recipe for exploiting the backdoor.

    If there’s a massive CVE on a closed source system, you get a super high-level description of the issue and that’s it.

    If there’s one on an open source system, you get ready-made “proof of concepts” on github that any script kiddy can exploit.

    And since not every software can be updated instantly, you are left with millions of vulnerable servers/PCs and a lot of happy script kiddies.

    See, for example, Log4Shell.

    • oce 🐆@jlai.lu
      arrow-up
      82
      ·
      edit-2
      7 months ago

      If your security relies on hidden information, then it’s at risk of being broken at any time by someone who finds that information some way. Open source security is much stronger because it works independently of system knowledge. See all the open source cryptography that secures the web, for example.
      Open source PoCs and fixes increase awareness of issues and help everyone make progress. You also get many more eyes to verify your analysis and fix, as well as people checking whether there could be other consequences in other systems. Some security specialists will probably create techniques to detect this kind of sophisticated attack in the future.
      This doesn’t happen with closed source.
      If some system company/administrator is too lazy to update, the fault is on them, not on the person who made all the information available for you to understand and fix the issue.

      • prettybunnys@sh.itjust.works
        arrow-up
        11
        ·
        7 months ago

        Crowd sourcing vulnerability analysis and detection doesn’t make open source software inherently more secure.

        Closed source software has its place and it isn’t inherently evil or bad.

        This event shows the good and bad of the open source software world but says NOTHING about closed source software.

        • oce 🐆@jlai.lu
          arrow-up
          30
          ·
          7 months ago

          Crowd sourcing vulnerability analysis and detection doesn’t make open source software inherently more secure.

          It does, because many more eyes can find issues, as illustrated by this story.

          Closed source isn’t inherently bad, but it’s worse than open source in many cases including security.

          I think you’re the only one here thinking publishing PoC is bad.

          • prettybunnys@sh.itjust.works
            arrow-up
            4
            ·
            edit-2
            7 months ago

            This is literally how I make my living, and this is the only comment I’ve made, so I’m not sure where you got the idea that I think publishing vulnerabilities and PoCs is bad … again, I literally do this for a living.

            Finding vulnerabilities and reporting them is literally what pays my mortgage. Open source and closed source both have their merits, but to say one is inherently more secure for the reasons you’re specifying is patently false.

            My comment is literally only about what you said, which pushes a thought that slides too far in one direction. There is a reason no nation state will open source its military hardware.

            • oce 🐆@jlai.lu
              arrow-up
              1
              ·
              edit-2
              7 months ago

              Then please explain why the reasons specified here are false, beyond that argument from authority.

              • prettybunnys@sh.itjust.works
                arrow-up
                4
                ·
                edit-2
                7 months ago

                I don’t need to repeat myself but that’s all I’d be doing.

                You’re making the argument that open source software inherently does this better, and I’m telling you that you’re wrong. I’m going to cite myself, a 20-year veteran in the field.

                It can do it better, and oftentimes it does work out this way.

                Closed source software also has value and use and for its own set of reasons could make the argument that it is more secure because of access controls and supply chain management and traditional security mechanisms.

                I think you read what I wrote as “no, you’re entirely wrong”, whereas what I said was “you’re asserting things that aren’t true, which is weakening the argument”.

                Frankly, though, given the lack of response to what I actually said by anyone, I’m just going to rest on knowing that in the real world my input is considered valid; here, we’re being fanatics … idk, for all you know I’m a bot spewing AI-generated drivel.

                Maybe the disconnect here is that I’m talking about practical application from experience vs theoretical application from ideology.

                • oce 🐆@jlai.lu
                  arrow-up
                  1
                  ·
                  edit-2
                  7 months ago

                  No I don’t think you said I was entirely wrong, that part was clear enough.

                  My issue is more with your argument from authority and personal experience. It is very easy to be biased by personal experience, especially when it brings good money.

                  access controls and supply chain management and traditional security mechanisms.

                  So I’ll put in my personal experience too (which is also a low-value argument). From the outside it may seem this is well done in big companies. But the reality is that it is often a big mess, and security often depends on some guy, if any, actually having standards and enforcing them, until they leave because the company doesn’t value those tasks. But since it’s closed source, nobody knows about it. With open source, there’s a better chance more people will look at the system and find issues.
                  I don’t doubt some ultra-sensitive systems like nuclear weapons have a functional closed source security process, because the government understands the risk well enough. But I think there are many more closed source systems, at a lower danger level but which still impact people’s security, that are managed to a much lower standard than if they were open sourced.

            • oce 🐆@jlai.lu
              arrow-up
              1
              ·
              7 months ago

              That’s a good point, but wasn’t the micro-benchmarking possible, published, and analyzed because it is open source? Also, the vulnerability analysis, impact analysis, and fix can be peer reviewed by more eyes.

          • summerof69@lemm.ee
            arrow-up
            2
            ·
            7 months ago

            It does, because many more eyes can find issues, as illustrated by this story.

            This story illustrates that some eyes can find some issues. For a proper discussion we need proper data and ratios; only then could we compare. How many issues are there in open and closed source software? How many of them get fixed? Unfortunately, we don’t have this data.

            • oce 🐆@jlai.lu
              arrow-up
              1
              ·
              7 months ago

              I think some of this data is actually available for open source projects by scanning public repositories, although it would be a lot of work to collect it.

        • oce 🐆@jlai.lu
          arrow-up
          4
          ·
          edit-2
          7 months ago

          In this case, downgrading to an unaffected version. If no downgrade is possible, stopping the compromised system until it is fixed.
          Keeping the vulnerable system up because you think nobody else knows is a bet, and I don’t think it’s a sound one. State actors invest a lot to find and exploit these vulnerabilities (in this case they probably even funded the implementation of the vulnerability), so I think you should assume that any vulnerability you discover is already being used.

    • SpaceMan9000@lemmy.world
      arrow-up
      44
      ·
      7 months ago

      Honestly, for closed source software the PoCs are also immediately available. Lots of threat actors just use patch diffing.

      These days vulnerabilities are sometimes also patched alongside unrelated commits to conceal what exactly has changed.

    • ozymandias117@lemmy.world
      English
      arrow-up
      29
      ·
      7 months ago

      Even in open source, responsible disclosure is generally possible.

      See, e.g., Spectre/Meltdown, where researchers worked privately with high-level Linux kernel developers for months to have patches ready on all supported branches before they made the vulnerability public.

    • KillingTimeItself@lemmy.dbzer0.com
      English
      arrow-up
      12
      ·
      7 months ago

      This is why we invented responsible disclosure, which even companies like Apple practice. Although in this case, this was the very beginning of what seemed to be a rollout, so if it does affect systems, it’s not very many. And if they are affected, the solution is pretty obvious.

      Don’t be a dunce, report responsibly.

      • Square Singer@feddit.de
        arrow-up
        4
        ·
        7 months ago

        Oh, we’re playing dumb ad hominem without any basis in reality?

        I can play this too: was your last school homework hard?

    • Possibly linux@lemmy.zip
      English
      arrow-up
      10
      ·
      7 months ago

      I’m pretty sure there are plenty of ways to exploit proprietary systems. You can’t stop the power of the keyboard.

    • ris@feddit.de
      arrow-up
      7
      ·
      7 months ago

      In this case it seems the backdoor is only usable by someone who has the correct key. Seeing and reverting something fishy is, in some cases like this one, easier than finding an exploit. It took a lot of time in this case to figure out what was going on.

      Fixing a bug doesn’t automatically give script kiddies an easy-to-use exploit.