While I do agree that there are some very problematic maintainers who are basically blocking progress just because they're old farts who don't like change, I also agree with Linus that immediately running to social media to drum up drama is not the correct way to get it fixed.
I am ultimately on the side of the Rust maintainers here, but there were definitely mistakes on both sides. I am not a fan of the trend over the past few years where, any time something doesn't go your way, you try to drum up as much drama on social media as possible. I am also not a fan of policing how people choose to word things, and I don't think there was anything wrong with the context in which "cancer" was used. They were saying that the code would eventually grow uncontrollably and in a way that was unmanageable, which is the literal definition of cancer: a cell that grows in an uncontrolled and unmanageable way.
Regardless of whether or not I agree that it would actually become that kind of problem, I don't see any issue with using the word that way.
There's also some fault with Torvalds here: he needs to step up and say either that Rust is okay or that it's not, because this wishy-washy game is bullshit and isn't helping anybody. He originally accepted it into the kernel but has been letting random maintainers create roadblocks for it. He either needs to tell them to back the fuck off and get over it, or he needs to drop the notion that Rust is accepted.
He mentioned turning a technical argument into drama, and while I am nowhere near as knowledgeable as these people, I didn't see much technical debate. I saw a maintainer who clearly said they just hated Rust and were going to do everything they could to block it and not work with anyone on it. That doesn't sound like a very technical argument to me. There were a couple of concerns raised, but they appeared to be addressed by multiple people quite thoroughly, as there were both misunderstandings and even further potential compromises offered.
NACKing Rust at version 8 of the patchset is kind of a dick move. So the previous 7 were fine? Ridiculous that Linus didn't step in in a definitive way.
Without having looked into it, I find it plausible that it could take several patchsets to come to an assessment of the consequences and a conclusion, especially as those patchsets land alongside ongoing review and discussion. The patchset number may also be largely irrelevant depending on what was changed.
There are definitely legitimate situations where that is the case, but I do not think this is one of them. To quote the reason for the rejection (from here):
I accept that you don’t want to be involved with Rust in the kernel, which is why we offered to maintain the Rust abstraction layer for the DMA coherent allocator as a separate component (which it would be anyways) ourselves.
Which doesn’t help me a bit. Every additional bit that the another language creeps in drastically reduces the maintainability of the kernel as an integrated project. The only reason Linux managed to survive so long is by not having internal boundaries, and adding another language complely breaks this. You might not like my answer, but I will do everything I can do to stop this. This is NOT because I hate Rust. While not my favourite language it’s definitively one of the best new ones and I encourage people to use it for new projects where it fits. I do not want it anywhere near a huge C code base that I need to maintain.
These do not sound like the words of someone who had been on the fence but was finally pushed over to one side by the last patchset in a sequence.
Rustaceans could just wait for unwelcoming C coders to slowly SIGQUIT…
Yep. Sometimes that’s all it takes for progress to stop being hampered.
Progress?
Just curious - when’s the last time you compiled the kernel yourself? Do you remember how long it took? And that was all just C, which - while not exactly fast - is at least an order of magnitude faster to compile than Rust.
I'm seriously concerned that if Linux really does slowly become predominantly Rust, development will stop, because nobody without access to a server farm will be able to compile it in any reasonable amount of time.
Rust would be better suited to a micro kernel, where the core is small and subsystems can be isolated and replaced at run time.
Edit: adding a more modern language isn't a bad idea, IMHO; I just think it should be something like Zig, which has reasonable compile times and no runtime. Zig's too young, but by the time it's considered mature, Rust will either be entrenched, or such a disaster that it'll be decades before kernel developers consider letting a new language in.
Have you compared the compile times for equivalent kinds of drivers in the Linux kernel?
All the different tests I've seen comparing Rust and C put compile times in the same ballpark. Even if somehow every test is unrepresentative of real-world compile times, I doubt it is "order[s] of magnitude" worse.
I remember watching someone benchmark hosting an HTTP webpage, comparing Zig, Rust with a C HTTP library, and native Rust. Native Rust easily beat the others and was able to handle tens of thousands more client connections. While I know this isn't directly relevant to kernels, the most popular C HTTP library is most likely quite optimized.
Memory-related vulnerabilities are consistently among the top reported vulnerabilities. It is a big deal, and no, you can't just program around it. Everyone makes mistakes, has a bad day, or has something on their mind; those are moments of human fallibility. Eliminating an entire class of vulnerabilities while staying competitive with C is a hard task, but entirely worth doing.
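To make that concrete, here's a tiny made-up sketch of the kind of bug the Rust compiler rejects outright; the equivalent C compiles fine and only blows up (maybe) at runtime:

```rust
fn main() {
    let reference;
    {
        let data = vec![1, 2, 3];
        // This borrow outlives `data`, which goes away at the end of this block.
        reference = &data;
    } // `data` is dropped (freed) here.

    // In C this pattern compiles and becomes a use-after-free at runtime;
    // rustc refuses it: error[E0597]: `data` does not live long enough.
    println!("{:?}", reference);
}
```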
Of course compiling something without all those checks is faster. If that's your standard, we should write the kernel in JS, Python, Ruby, Lua, or any other dynamically typed language, since then there's no compile time at all.
Progress means I don’t have to read blog posts in order to compile the kernel. Progress means I have a sane toolchain that lets me run, test, debug, manage dependencies, and even distribute my code and artefacts (documentation, compile output, …) easily. Progress means catching many more bugs at compile-time instead of runtime.
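As a trivial, made-up illustration of that tooling (the `double` function is invented just for the example): a unit test that `cargo test` discovers and runs with zero extra configuration.

```rust
/// Saturating doubling, just so there is something to test.
pub fn double(x: u32) -> u32 {
    x.saturating_mul(2)
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn doubling_saturates_instead_of_overflowing() {
        assert_eq!(double(3), 6);
        assert_eq!(double(u32::MAX), u32::MAX); // no wraparound surprise
    }
}
```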
You’re throwing the baby out with the bath water with the reductio ad absurdum argument. Rust may very well be less secure than Ada - if so, then does that make it not good enough?
I say some improvement in safety isn't worth vastly longer compile times and a more cognitively complex - harder - language, which raises the barrier to entry for contributors. If the trade were more safety than C, even if not as much as Rust, but with improved compile times and reasonable comprehensibility for non-experts in the language, that would be a reasonable trade.
I have never written a line of code in Zig, but I can read it and derive a pretty good idea of what the syntax means without a lot of effort. The same cannot be said for Rust.
I guess it doesn’t matter, because apparently software developers will all be replaced by AI pretty soon.
I have never written a line of code in Zig, but I can read it and derive a pretty good idea of what the syntax means without a lot of effort. The same cannot be said for Rust.
That's you, dawg. You probably have a different background, because while I can sort of follow Zig code, I have no idea what a bunch of the stuff in it actually means.
See these samples:

```zig
pub fn enqueue(this: *This, value: Child) !void {
    const node = try this.gpa.create(Node);
    node.* = .{ .data = value, .next = null };
    if (this.end) |end| end.next = node //
    else this.start = node;
    this.end = node;
}
```

`pub fn enqueue(this: *This, value: Child) !void {` - `!void`? It's important to return `void`? Watch out, `void` is being returned? Does that mean that you can write `!Child`? And what would that even mean?

`const node = try this.gpa.create(Node);` - what does `try` mean there? There's no `catch`, no `except`. Does that mean it just kills the stack and throws the exception up until it reaches a `catch`/`except`? If not, why put a `try` there? Is that an indication that it can throw?

`node.* = .{ .data = value, .next = null };` - excuse me, what? Replace the contents of the `node` object with a new dict/map that has the keys `.data` and `.next`?

`if (this.end) |end| end.next = node //` - what's the lambda for? And what's the `//` for? A forgotten comment or an operator? If it's there to escape the newline, why isn't it a backslash like in other languages?

`start: ?*Node` - question pointer? A nullable pointer? But aren't all pointers nullable? Or does Zig make a distinction between zero pointers and nullable pointers?

```zig
pub fn dequeue(this: *This) ?Child {
    const start = this.start orelse return null;
    defer this.gpa.destroy(start);
```

`this.start orelse return null` - is this a check for null, or a check for 0, or both?

However, when I read Rust the first time, I had quite a good idea of what was going on. Pattern matching and `move` were new, but traits were quite understandable coming from Java with interfaces. So yeah, mileage varies wildly, and just because you can read Zig doesn't mean the next person can.
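For contrast, a rough little sketch of how the same "maybe there's a node" idea reads in Rust with `Option` and pattern matching (the `Node` and `sum` below are made up for illustration, not taken from the Zig example above):

```rust
// A toy singly linked list: `next` is either another node or nothing,
// playing the role of Zig's `?*Node`.
struct Node {
    data: i32,
    next: Option<Box<Node>>,
}

fn sum(list: Option<&Node>) -> i32 {
    match list {
        // Pattern matching instead of Zig's `if (...) |x|` capture syntax.
        Some(node) => node.data + sum(node.next.as_deref()),
        // The `orelse`-style fallback.
        None => 0,
    }
}

fn main() {
    let list = Node {
        data: 1,
        next: Some(Box::new(Node { data: 2, next: None })),
    };
    assert_eq!(sum(Some(&list)), 3);
}
```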
Regardless, it’s not like either of us have any pull in the kernel (and probably never will). I fear for the day we let AI start writing kernel code…
Granted, everyone is different. The cognitive load of Rust has been widely written about, though, so I don't think I'm an outlier.
Regardless, it’s not like either of us have any pull in the kernel (and probably never will). I fear for the day we let AI start writing kernel code…
Absolutely never, in my case. This isn’t what concerns me, though. If Rust is harder than C, then fewer people are going to attempt it. If it takes several hours to compile the kernel on an average desktop computer, even fewer are going to be willing to contribute, and almost nobody who isn’t creating a distribution is ever going to even try to compile their own kernel. It may even dissuade people from trying to start new distributions.
If, if, if. Maybe it seems as if I’m fear-mongering, but as I’ve commented elsewhere, I noticed that when looking for tools in AUR, I’ve started filtering out anything written in Rust unless it’s a -bin. It’s because at some point I noticed that the majority of the time spent upgrading software on my computer was spent compiling Rust packages. Like, I’d start an update, and every time I checked, it’d be in the middle of compiling Rust. And it isn’t because I’m using a lot of Rust software. It has had a noticeable negative impact on the amount of time my computer spends with the CPU pegged upgrading. God forgive me, I’ve actually chosen Node-based solutions over Rust ones just because there was no -bin for the Rust package.
I don't know if this is the same type of "cancer" as in the vitriolic kernel ML email that led to the second-to-last firestorm, but this is how I've started to feel about Rust - if there's a bin, great! But no source-based packages, because then updating my desktop starts to become a half-day journey. I'm almost to the point of actively going in and replacing the source-based Rust tools with anything else, because it's turning updating my system into a day-long project.
Haskell is already in this corner. Between the disk space and glacial ghc compile times, I will not install anything Haskell unless it’s pre-compiled. And that’s me having once spent a year in a job writing Haskell - I like the language, but it’s like programming in the 70’s: you write out your code, submit it as a job, and then go do something else for a day. Rust is quickly joining it there, along with Electron apps, which are in the corner for an entirely different reason.
Zig is designed as a successor to C, no? So I assume it handles syntax and such quite similarly. Rust is not a C-like language, so I don't think this is a fair comparison at all.
But in the end, learning syntax isn't the hard part of a new language (even if it is annoying sometimes).
learning syntax isn't the hard part of a new language
No, it’s not, and that’s worse, not better. Understanding the pitfalls and quirks of the language, the gotchas and dicey areas where things can go wrong - those are the hard parts, and those are only learned through experience. This makes it even worse, because only Rust experts can do proper code reviews.
TBF, every language is like this. C's probably worse in the foot-gun areas. But the more complex the language, the harder it is for people to get over that barrier to entry, and the fewer who will try. This is a problem of exclusion, and a form of gatekeeping that's designed - unintentionally - into the language.