Decades ago, the TV took five minutes to warm the tubes up before one could watch the news.
Today, the TV takes five minutes to boot, install updates, and mangle some configuration before one (eventually) can watch the news - if the TV has not lost its list of stations again.
By the mid ’80s and through the ’90s, CRT TVs took just seconds to show output on the screen. Even the really old tube TV my grandma had would warm up within seconds.
I once got gifted a TV from a nice elderly guy. The TV had been the cutting edge of technology when it was built: it had a wireless remote! Although the remote worked with ultrasound instead of infrared…
This beast took several minutes before it actually showed a picture.
Must’ve been a REALLY old one. I’m old as dirt, and they’ve taken mere seconds all my life. Even fast TVs now take longer to show a picture than the console ones we had when I was a kid, although I did see some from the 50s and 60s that took quite some time.
The original “clicker” remotes were really neat tech! The way it worked unfortunately limited the number of buttons you could have, but still ingenious.
https://www.theverge.com/23810061/zenith-space-command-remote-control-button-of-the-month
I used to have one of those black plastic (or was it Bakelite?) Space-Commander 400 remotes, pictured in the black and white ad.
I was walking home from grade school. Somebody was getting rid of their ancient TV, and had left it on the curb. The boxy, awkwardly shaped remote was in its “holster” on the TV, so I grabbed it and took it home. Before then, I had assumed that only infrared wireless remotes existed.
The idea that a remote could work by ultrasound fascinated me, and the fact that it didn’t even need batteries absolutely blew my little mind.
Let me tell you how shitty they were, and why they probably put this thing to the curb:
The “receiver” part of that thing was so limited that it basically interpreted all kinds of ultrasonic sounds as “commands”. Whenever I pulled my curtains open or closed, the TV went nuts. It turned off, or it turned the volume to 11, or whatever. I was working on a small piece of metal on my desk, and with every stroke it changed channels, either up or down. This thing was annoying.
I tell my laptop to put the video in the vga port. It does. That’s it. There’s nothing plugged in, but it’s there.
I plug a vga cable in. There’s video in there now. With enough paperclips, I could get it out the other end. My laptop does not care. It wiggles the electrons regardless.
I plug the other end of the cable in. The shielding was eaten by mice and two pins are dead. But alas, lo and behold, purple tho it may be - the video comes out and is displayed.
Meanwhile, hdmi protocol negotiation wants to know if you’d like to set your screen as the default sound device. Not that teams would use it anyway. Actually nevermind, the receiving end doesn’t support the correct copyright protection suite. Get fucked, no video for you.
This is stressing me out :P
Feels like everything is much more of a faff to set up, then one bit updates & something or other is no longer compatible.
Don’t even want to think about the waste it must generate, both of devices & of the hours trying to get things to work whether at the development end or in the home.
at this point i don’t understand why people bother with TVs rather than just hooking up an actual normal computer to a big screen and just watching youtube or torrenting media
Because you can get much bigger for cheaper compared to a monitor.
deleted by creator
even the monitors are “smart” now. have you seen Samsung’s latest computer monitors?
I even hate how Windows tries to determine which monitors are in use, so if I turn one off it sends everything to the other screen. Just literally be a dumb thing that displays output, please, don’t try to think for me.
Because size.
Tbh was referring to devices generally. I don’t go near televisions today. Too big, too expensive & way too much faff.
Flatscreens: Set it gently on the table, or you might break the screen.
CRTs: Set it gently on the table, or you might break the table.
4K HDR at 120 Hz, with surround sound we can easily hook up, is great. But fuck, you sneeze wrong and it gets a weird scanline issue, like mine.
Even old flat screens are ridiculously heavy compared to new ones. I replaced an old Sony 720p screen that weighed probably 20 pounds with a 1080p smart TV of the same size that I could lift one-handed. And the new one cost less than $200.
I somewhat miss my old Sony 720p screen… it came with a full electronics diagram in case you wanted to repair it.
I grew up with CRTs and VCRs, hard pass. There’s a certain nostalgia to it all: the bum-DOOON sound as its electron gun warmed up, the smell of ozone and tingly sensation that got exponentially stronger the closer you were, crusty visuals… But they were objectively orders of magnitude worse than what we have now, if for no other reason than that today’s sets don’t weigh 150 pounds or make you wonder if watching Rugrats in Paris for the 30th time on this monster is giving you cancer. Maybe it’s because I’m techie, but I’ve never really had much issue with “smart” TVs. Sure, apps will slow down or crash because of memory leaks and it’s not as customizable as I’d like, but I might be satiated just knowing that if push comes to shove I can plug in a spare computer and use it like a monitor for a media system.
I’m rooting it if it starts serving me out-of-band ads, though.
deleted by creator
They don’t seem to have a lecturing tone in their comment. The only part which you might have a point about is where they say “objectively”, but throughout the whole comment they’re really just expressing their opinion and showing their experience with smart TVs, which they’re entitled to have and might be different from yours.
No aggressiveness intended. Just trying to keep the niceness around.
deleted by creator
Thanks for understanding.
I tried using the “smart” of my TV once; it was so slow and laggy I plugged in my 7+ year old Roku and never touched the smart features again.
deleted by creator
I feel like I’ve missed something. I don’t dispute any of the horrible experiences people have had, however I’ve had nothing but good luck. The only thing about our current television that bothers me is the promotional wallpapers that get applied every-fucking-time a new Disney property needs advertising. We buy relatively modestly priced units in the $300-$500 range, so maybe we just have different expectations than someone buying a much more high end unit. It is also possible that it has been pure luck and I’ll reply to this message one day soon to recant everything.
The TVs you buy force you to see ads even when you’re not watching a program and you’re like “I’ve had great luck”?
Some people don’t base their entire personality around hating the existence of ads and jumping through outrageous amounts of steps to avoid them.
So yeah, count me in for a TV that always works how I want it to but had a background ad that I can completely ignore and has no actual bearing on my life.
There are so many more important things for me to spend my time and energy worrying about.
Ah yes, the only two options: complete apathy, or basing your entire personality around hating ads. Did I like hit a nerve or something, or why did you get so defensive over a joke?
Sorry, definitely touched a nerve.
I am really enjoying Lemmy, but the vast number of users who are proponents of piracy/ad blocking without being open to ANY discourse is really frustrating and gets super annoying.
There is a post about Blockbuster phasing out DVDs. One commenter said that since the enshittification of streaming services, they and their friends have been buying more physical media, so not being able to go to Blockbuster kind of sucks. Some users’ immediate reaction was to call them out for “being stupid” and provide a link for free streaming (aka piracy), saying that “as long as you have ublock then you’re good to go”.
There are people here on Lemmy that will one second scream “Work reform! Unionize! Pay me a living wage!” and then immediately turn around and say “Yea, I’m going to do everything in my power to prevent these creators of media from making money. Adblock + PiHole + VPN + piracy is the only way to use the internet. They don’t deserve my money.”
I frequently pirate movies, AFTER I have purchased a physical copy of said movie. Someday I hope that we can use something like the blockchain to tie ownership to digital files regardless of platform but at the moment that’s a pipe dream.
So in the meantime, I will have my ads or I will pay my monthly fees, because I want people to be paid for what they created. Even if it’s Disney or Marvel or insert giant company here, they created something that didn’t exist before that I enjoy. They deserve to make money off that.
Oh yeah, I definitely agree with you about how it’s downright myopic to be against all forms of ads, considering the economic system we find ourselves in. Especially silly are people who complain both about ads on news sites and about how clickbait-y news often is due to having to chase those ad views, but who also refuse to pay for news. The same goes for any ad-funded service, really; you’re not getting it for free, you’re “paying with your attention”, i.e. ad impressions. The idea that we’re entitled to free content is ridiculously selfish.

Sure, I think it sort of sucks that ads are so pervasive and can affect business models negatively, but them’s the breaks when content creators and service providers need to eat too. Ads just happen to be a much easier source of income than subscriptions, let alone voluntary donations (and it’s not like everybody can afford to e.g. pay for news or whatever). That doesn’t mean it’s not possible to fund things without ads, but ads are obviously more dependable in many cases than other options, or they wouldn’t be so popular.

In previous conversations I’ve had smart-asses say that this is supposedly ironic considering I’m on a platform supported by voluntary donations, as if the existence of e.g. Beehaw means that ads are completely unnecessary in all situations for all services, and everything could just run on voluntary donations or subscriptions. Sure worked out well for news media, didn’t it (the answer to that has been “hur dur mainstream media sucks so why would I pay for it”, totally oblivious to the fact that some of it sucks exactly because nobody wanted to pay for it anymore).
So yeah I definitely share your irritation at how frankly stupid some people’s attitude towards ads is, and how incredibly entitled it is to think that we’re somehow owed free content and services. Ads may be irritating but as I said, with this economic system this is how it is and we’re just going to have to suck it up. Refusing to expose your delicate sensibilities to any ads out of principle isn’t going to make them go away as a concept, and it definitely isn’t going to pay for goods or services that you now see for free.
However, what – at least to me – feels different about TVs or other such doodads showing you ads is that you already paid for the product and you’re still getting ads. I guess it can be a way to lower the price of the product, but considering how they’re often not all that much cheaper it honestly feels more like just squeezing more money out of the users at the expense of user experience. Ads in otherwise free services do make sense, but this feels more, I dunno, predatory?
deleted by creator
promotional wallpapers
What’s that? I mean, I know what a wallpaper is, but when would a TV display one of those?
deleted by creator
All of those issues are covered by other devices that most people already have. An Xbox 360/PS3 handled the smart features of its day, and any newer gaming system can output 4K, making the smart features in the TV unnecessary. The same is true of a cable box, Roku plugin, Fire Stick, or any other streaming device. All the TV really needs to do is display the 4K signal it receives. TVs don’t even really need tuners anymore: just a USB hub, a processor for video and audio output, and a screen.
deleted by creator
Or grab a mini media focused computer to hook up to it.
I was fine with the quality of old TVs ¯\_(ツ)_/¯
And no thank you, I’m not going to do all that. I don’t care enough about any shows to go through all that hassle. I just want my TV to work without extra expense, and I will complain when it doesn’t because I hate big corporations and I want them to fail.
I was happy with the quality, and don’t get more enjoyment from all the advancements since, but only ever remember plugging it into the wall, plugging an aerial into the back of it & pressing one button to get the tuner to pick up channels. Batteries into the remote once that became a thing. Plug in a VCR or DVD player once they appeared.
No need for a phone line or internet or updates.
I would prefer analog TV, honestly. At least then a partial signal is something.
Descrambler box + skinamax?
625i@50fps blobby
The comparison between CRT and digital is not as simple as “625 vs 4k”. Those analog signals were intended for a triangular subpixel shadow mask with no concept of horizontal pixel count, making them effectively into ∞×625i@50fps signals (1), compared to the digital fixed 3840×2160@60fps square pixels regardless of subpixel arrangement.
It takes about a 6x6 square pixel block to correctly represent a triangular subpixel mask, making 4K LCDs about the minimum required to properly view those 625i@50fps signals.
(1) I’m aware of optics limitations, CoC, quantum effects, and ground TV carrier band limitations, but still.
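As a rough back-of-the-envelope check of those figures (my arithmetic, taking the comment’s 6×6-pixels-per-mask-cell figure and the 625-line count at face value):

```python
# Assumptions from the comment above: 625 scanlines, and about a
# 6x6 block of square pixels to reproduce one triangular mask cell.
scanlines = 625
pixels_per_cell = 6

rows_for_full_fidelity = scanlines * pixels_per_cell
rows_on_4k_panel = 2160  # 4K UHD is 3840x2160

print(rows_for_full_fidelity)        # 3750
print(rows_on_4k_panel / scanlines)  # 3.456 pixel rows per scanline
```

So a 4K panel gives roughly three and a half pixel rows per scanline rather than the full six, which fits the comment’s framing of 4K as “about the minimum” for viewing those signals rather than a perfect reproduction.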
deleted by creator
I hate smart stuff so much. It’s fucking impossible to find a good TV that doesn’t have all of this shit thrown in now. I just want a nice display, and that’s it. But what’s worse is when it not only comes with awful software, but they also take “the Apple route” for their features/services. So you have an issue like this and you can’t do anything about it.
What am I talking about, you say? “The Apple way”, but I also like to call it the “fucking magic” syndrome. The “fucking magic” syndrome is when something is supposed to be “magic”, to “just work”, BUT WHEN IT DOESN’T… you’re shit out of luck. :)
Because you see, it’s supposed to just work. It’s absolutely inconceivable for it to not just work. So the people who made it never even for a second considered that it might fail, and never took the time to implement some kind of failsafe in the UI to allow you to actually force the thing to do its thing on the off chance that the rabbit just refuses to come out of its hat.
Anyone who’s had to update their AirPods knows exactly what I’m talking about. They’re supposed to update themselves, without you doing anything. But every now and then… THEY FUCKING DON’T, AND THERE IS ABSOLUTELY ZERO WAY TO FORCE THEM TO DO SO! You just have to wait with your AirPods open in their case for the moon to be in the correct position or something.
Removed by mod
Unfortunately streaming has become the norm, and cable’s no longer affordable
I’d rather go outside if that’s how it’s gonna be
Removed by mod
This is how I do it too. I bought a mini office PC and HDMI’d it to the TV. It’s nothing fancy, just an i5, a wifi card, and an integrated GPU. I threw Kodi onto it, cancelled every streaming service I was using, and then returned to the high seas to fill up the 10 TB external HDD I connected. When a new episode of something drops, I just download it and Kodi nicely organizes everything.
Removed by mod
There is a Jellyfin addon for Kodi, just saying.
deleted by creator
Removed by mod
deleted by creator
Removed by mod
I liked how you could still watch a station even if it was barely coming in.
This is less an issue of “smartness” and more because analog signals degrade gracefully whereas digital signals are all or nothing unless specific mitigations are put in place. HDMI hits kind of a weird spot because it’s a digital protocol based on analog scanlines: if the signal gets disrupted for 0.02 ms, it might only affect the upper half of the frame and maybe shift the bits for the lower half. But the data is contextual and resynchronizes at least every frame, so even this kind of degradation is unstable.
analog signals degrade gracefully whereas digital signals are all or nothing unless specific mitigations are put in place
Not really. Digital signals come over analog mediums, and it’s up to the receiver to decide how much degradation is too much. Mitigations like error correction are intended to reduce the final errors to zero, but it’s up to the device to decide whether it shows/plays something with some errors, and how many of them, or if it switches to a “signal lost” mode.
For example, compressed digital video has a relatively high level of graceful degradation: full frames come every Nth frame and they are further subdivided into blocks, each block can fail to be decoded on its own without impacting the rest, then intermediate frames only encode block changes, so as long as the decoder manages to locate the header of a key frame, it can show a partial image that gets progressively more garbled until the next key frame. Even if it misses a key frame, it can freeze the output until it manages to locate another one.
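That keyframe-plus-deltas scheme can be sketched as a toy decoder (the names and structure here are mine, not any real codec): each frame is a set of blocks, a corrupted block is simply skipped so the old data stays on screen, and state resynchronizes at every keyframe.

```python
KEYFRAME_INTERVAL = 4  # hypothetical: every 4th frame is a full keyframe

def decode_stream(frames):
    """frames: list of dicts mapping block index -> pixel data.
    A keyframe carries every block; a delta frame carries only changed
    blocks. A block value of None models a block that failed to decode."""
    screen = {}  # block index -> pixel data currently on screen
    out = []
    for i, frame in enumerate(frames):
        if i % KEYFRAME_INTERVAL == 0:
            screen = {}  # resynchronize: forget stale state at a keyframe
        for idx, block in frame.items():
            if block is not None:        # corrupt blocks are skipped,
                screen[idx] = block      # leaving the old data in place
        out.append(dict(screen))
    return out

# keyframe with a garbled block 1, then a delta touching only block 2
frames = [{0: "A0", 1: None, 2: "C0"},
          {2: "C1"}]
decoded = decode_stream(frames)
print(decoded[0])  # {0: 'A0', 2: 'C0'} - partial image, block 1 lost
print(decoded[1])  # {0: 'A0', 2: 'C1'} - the rest keeps updating
```

The point of the sketch is that one bad block degrades only its own patch of the picture; everything else keeps decoding, which is the graceful degradation described above.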
Digital audio is more sensitive to non-corrected errors, which can cause high-frequency, high-volume screeches. Those need more mitigations, like filtering to a normalized volume and frequency distribution based on the preceding blocks, but they still allow a level of graceful degradation.
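One simple concealment strategy along those lines (a toy sketch of my own, not what any particular codec does): instead of playing whatever garbage a corrupted block decodes to, repeat the previous good block at reduced volume.

```python
def conceal(blocks, fade=0.5):
    """blocks: list of sample lists; None marks a corrupted block.
    A corrupted block is replaced by the previous good block, faded,
    rather than letting random bits play as a full-volume screech."""
    out = []
    prev = [0.0]  # silence until something has been decoded
    for b in blocks:
        if b is None:
            b = [s * fade for s in prev]  # repeat last block, quieter
        out.append(b)
        prev = b
    return out

print(conceal([[0.8, -0.8], None, [0.2, 0.1]]))
# [[0.8, -0.8], [0.4, -0.4], [0.2, 0.1]]
```

The ear tolerates a faded repeat of the last few milliseconds far better than a burst of decoded noise, which is why this kind of filter counts as a mitigation for graceful degradation.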
deleted by creator