Did you know most coyotes are illiterate?


  • 4 Posts
  • 57 Comments
Joined 4 months ago
Cake day: June 7, 2025



  • Screen-sharing is part of chat apps nowadays. You’re fully within your rights to stay on IRC and pretend that featureful chat is not the norm these days, but that doesn’t mean society is going to move to IRC with you. Like it or not, encrypted chat apps have to become even more usable for the average person for adoption to go up. This reminds me of how all the old Linux-heads insisted that gaming was for children and that Linux didn’t need gaming. Suddenly now that Linux has gaming, adoption is going way up - what a coincidence.

    Edit: Also for the record, I have a tech-savvy friend who refuses to move to Signal until there are custom emoji reactions, of all things. You can definitely direct your ire towards these people, but the reality is some people have a certain comfort target, and convincing them to settle for less is often harder than improving the app itself.


  • Yeah, h264 is the base codec (also known as AVC), and x264 is the dominant encoder for it. So the base BDs are just plain h264, and remuxes take that h264 stream and put it into an mkv container. Colloquially, people tag WEB-DLs and BDs/remuxes as “h264” since the stream is raw/untampered-with, and anything that’s been re-encoded by a person as “x264”. Same thing for h265/HEVC and x265, and same for h266/VVC and x266.
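
    If you want to verify what a file actually contains, ffprobe (it ships with FFmpeg) will report the codec; a minimal sketch, assuming an mkv named input.mkv:

        # print the video stream's codec (e.g. "h264" or "hevc")
        ffprobe -v error -select_streams v:0 \
            -show_entries stream=codec_name,codec_long_name \
            -of default=noprint_wrappers=1 input.mkv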


  • Yeah, I’m reading a little bit on it, and it seems like apt-get upgrade can’t install new packages during an upgrade. On first reading I thought there were specific packages it couldn’t download or something, but this makes sense too. Regardless, this is news to me; I always assumed apt and apt-get were the same process, just with apt-get having stable text output for awk’ing and apt being human-readable. I’ve been using nala for a long time anyway, but this is very useful knowledge.
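
    For reference, my understanding of the split (worth double-checking against the man pages on your system):

        # apt-get upgrade: only upgrades already-installed packages; anything
        # that needs a new dependency gets "kept back" instead
        sudo apt-get upgrade

        # apt upgrade: will also install new dependencies (but never removes)
        sudo apt upgrade

        # both tools can do the full job, removals included:
        sudo apt-get dist-upgrade   # or equivalently: sudo apt full-upgrade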



  • I just discovered kids’ toothpaste recently, which doesn’t have this problem. All my life I’ve hated mint so much, and no one ever told me you can have candy as a toothpaste flavor. Just make sure you check the active ingredients for a regular amount of sodium/stannous fluoride; all the rest of the marketing gimmicks of adult toothpaste are mostly meaningless. Also, try not to swish water around your mouth after brushing anyway, since you want the fluoride/paste to sit on your teeth for a while.


  • Yep, fully agree. At least Blu-rays still exist for now. Building a beefy NAS and collecting full Blu-ray discs lets us brute-force picture quality through sheer bitrate, at least. There are a number of other problems to think about before we even get to the encoder stage, such as many (most?) 4k movies/TV shows being mastered in 2k (aka 1080p) and then upscaled to 4k. Not to mention a lot of 2k Blu-rays are upscaled from 720p! It just goes on and on. As a whole, we’re barely using the capabilities of true 4k today. Most of this UHD/4k “quality” craze is being driven by HDR, which has its own share of design/cultural problems. The more you dig into all this stuff, the worse it gets. 4k is billed as “the last resolution we’ll ever need”, which IMO is probably true, but they don’t tell you that the 4k discs they’re selling you aren’t really 4k.


  • The nice thing is that Linux is always improving and Windows is always in retrograde. The more users Linux has, the faster it will improve. If the current state of Linux is acceptable enough for you as a user, then it should be possible to get your foot in the door and ride the wave upwards. If not, wait for the wave to reach your comfort level. People always say <CURRENT_YEAR> is the year of the Linux desktop but IMO the real year of the Linux desktop was like 4 or 5 years ago now, and hopefully that captured momentum will keep going until critical mass is achieved (optimistically, I think we’re basically already there).


  • To be fair, it’s also basically impossible to make extremely high-quality AV1 video, which is what a lot of P2P groups strive for. A lot of effort has gone into trying, and the results weren’t good enough compared to x264, so it’s been ignored. AV1 is great at compression efficiency, but it can’t make fully transparent encodes (i.e., indistinguishable from the source). It might be different with AV2, though even if it’s possible it may be ignored for compatibility reasons instead; to this day, groups still use DTS-HD MA over the objectively superior FLAC codec for surround sound because of hardware compatibility (for 1.0/2.0 audio they use FLAC, since players usually support that).

    As for HEVC/x265, it too is not as good as x264 at very high-quality encoding, so it’s also ignored when possible. Basically the breakdown is that 4k encoding uses x265 in order to store HDR, and because x265’s large-block efficiency is good enough to compress further than the source material. x264 wouldn’t be used for 4k encoding even if it could store HDR, because its compression efficiency is so bad at higher resolutions that any quality encode would end up bigger than the source material. Many people don’t even bother with 4k x265 encodes and just collect the full discs/remuxes instead, because they dislike x265’s encoder quality and don’t deem the size savings worth the picture quality impact (pretty picky people here, and I’m not really in that camp).

    For 1080p, x265 is only used when you want HDR in a 1080p package, because again x265’s picture quality can’t match x264, but most people deem HDR the bigger advantage. x264 is still the tool of choice for non-HDR 1080p encodes, and that’s not a culture thing, that’s just a quality thing. Down in public P2P or random encoding groups it’s anything goes, and x265 1080p encodes are a lot more common because x265’s efficiency is great compared to x264, but its very top-end quality just can’t match x264 in the hands of an experienced encoder, so the top-end groups only use x265 when they have to.

    Edit: All that to say, we can’t entirely blame old-head culture or hardware compatibility for the unpopularity of newer formats. I think the home media collector usecase is actually a complete outlier in terms of what these formats are being developed for. WEB-DL content favors HEVC and AV1 because they’re very efficient and display a “good enough” picture for viewers. Physical Blu-rays don’t have to worry about HDD space or bandwidth and just pump the bitrate way up on HEVC so the picture looks great. For the record, VVC/x266 is already on the shortlist for being junk for the usecases described above (x266 is too new to fully judge), so I wouldn’t hold my breath for AV2 either. If you’re okay with non-transparency, I’d just stick with HEVC WEB-DLs or try to find good encoding groups that target a more opinionated quality:size ratio (some do actually use AV1!). Rules of thumb for WEB-DL quality are here, though it will always vary on a title-by-title basis.
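
    To make the quality-targeting talk above concrete, here’s a rough ffmpeg sketch of CRF-style encoding (illustrative settings only, not any group’s actual recipe; assumes an ffmpeg build with libx264/libx265 and a source.mkv input):

        # 1080p SDR: x264 at a low CRF with a slow preset chases transparency
        ffmpeg -i source.mkv -map 0 -c:v libx264 -crf 16 -preset veryslow \
            -c:a copy -c:s copy encode-x264.mkv

        # 4k HDR: x265 in 10-bit keeps the HDR pipeline intact
        ffmpeg -i source.mkv -map 0 -c:v libx265 -crf 18 -preset slow \
            -pix_fmt yuv420p10le -c:a copy -c:s copy encode-x265.mkv

    Lower CRF means higher quality and bigger files; real HDR encodes also pass mastering-display metadata through -x265-params, which I’m glossing over here.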


  • It looks likely that Overstreet has upset too many important, influential people, and hurt too many feelings — and as a result, Linux is not going to get a new next-gen copy-on-write filesystem. It’s a significant technological loss, and it’s all down to people not getting along, rather than the shared desire to create a better OS.

    I don’t like how this article is framed as if everyone else not tiptoeing around Kent is The Real Problem. He was given clear warnings and way more second chances than he deserved. He was (and still is) unable to follow the rules and control his temper, and everyone decided he’s a lost cause - as is completely logical. Just because you have a cool toy doesn’t mean everyone is forced to be your friend. Go play in your own sandbox until you learn to follow the rules like everyone else. Consider writing a giant apology letter and giving the Linux community the best gift of all: changed behavior.




  • The Ratchet & Clank CPU-limited run has some noticeable FPS dips/losses under NTSYNC that FSYNC doesn’t have. NTSYNC generally trails or ties FSYNC in most other cases. I didn’t watch every minute of the footage - just skipped around through the CPU-limited sections, since I imagine that’s the only part that matters. In any case, it seems like there’s not much to gain from using NTSYNC yet; maybe improvements will land so it at least ties FSYNC. My rudimentary (possibly incorrect) understanding is that FSYNC is hacky and NTSYNC is the “correct” way to do it, so if nothing else, getting NTSYNC to parity means FSYNC can eventually be deprecated.
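
    For anyone who wants to compare on their own setup, Proton exposes sync toggles as environment variables in Steam launch options. PROTON_NO_FSYNC/PROTON_NO_ESYNC are long-standing; PROTON_USE_NTSYNC is what recent experimental builds reportedly use, so double-check your Proton version before trusting that name:

        # force ntsync on for this game (recent Proton Experimental / GE builds)
        PROTON_USE_NTSYNC=1 %command%

        # baseline comparison: disable fsync (and esync) entirely
        PROTON_NO_FSYNC=1 PROTON_NO_ESYNC=1 %command%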



  • If all you want is efficient web quality, I’d highly recommend grabbing the cjpegli binary out of the latest static release from the libjxl repo and using the following command to slim down images with near-zero visual quality loss: cjpegli -q 90 -p 2 "input.jpg" "output.jpg". It uses modern encoding techniques derived from the next-gen JPEG XL project and stuffs the result back into a regular old JPEG container. Swap the 90 for whatever quality level you want to target (e.g. 90/92/95). After playing around with a few quality levels and checking the resulting filesizes, you should get a feel for what you can reasonably get away with given the resolution and makeup of a particular image. If you still can’t get it small enough, you probably need to start reducing the resolution as well.
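
    If you’ve got a whole folder to slim down, a quick loop works too (a minimal sketch assuming bash, with cjpegli on your PATH; the slim_ prefix is just my placeholder naming):

        # re-encode every JPEG in the current directory at quality 90
        for f in *.jpg; do
            cjpegli -q 90 -p 2 "$f" "slim_$f"
        done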

    In terms of what size an average image should be for Threadiverse purposes, I’d shoot for 0.5-1MB. If it’s just a meme or something whose value isn’t tied to image quality, I’d aim lower, whereas if it’s OC like photography I’d bump the quality higher (or maybe have a web-quality version available on click, with a higher-quality version hosted elsewhere).



  • This video mentions a lot of self-hosting and somewhat advanced stuff, but if all you want to do is start dipping your toe into Linux, it’s not nearly as hard as you’d think. I would try running Linux in a VM (e.g. VirtualBox) to get a feel for how it operates and build up confidence that way, as well as maybe watch some videos on how people set up and use their Linux systems. It will be a learning curve, but as long as you pick a beginner-friendly distro (e.g. Linux Mint), it’s really no more difficult than if you were using Windows for the first time. Keep backups of your data and/or put Linux on a secondary computer, and you should weather the initial few weeks just fine.

    On the upside, when you have problems in Linux there will be logical solutions with answers that can be searched for, whereas on Windows or macOS the solution is probably “I dunno! Reinstall?” or “You just can’t do that, sorry”. It’s also understandable if you don’t want to touch anything complicated, but I do think one of the best parts of Linux is really just getting messy, making mistakes, and learning. Because things in Linux make sense, over time you’ll learn how to use a computer again. I feel strongly that Windows/macOS/Android/iOS have (intentionally) dulled the average person’s computing skills and put them into a state of learned helplessness. Everyone thinks computers are complicated wizardry because nothing on those proprietary operating systems makes logical sense, and trying to troubleshoot anything results in wasted time and frustration.


  • Pretty good video. It’s not like he explains how to do anything or even picks very good software to begin with, but his genuine excitement is really all that’s required. Getting people interested is the important part, and they’ll learn much better by using their own motivation. This video also gives off a strong “I’m an idiot, and if I can do it you can do it” vibe which can be really reassuring to those who are just too intimidated to even dip their toe in.