I absolutely hate “smart” TVs! You can’t even buy a quality “dumb” panel anymore. I can’t convince the rest of my family and friends that the only things those smarts bring are built-in obsolescence, ads, and privacy issues.
I make it a point to NEVER connect my new 2022 LG C2 to the Internet, as any possible improvements from firmware updates will be overshadowed by garbage like ads in the UI, removal of existing features (warning: reddit link), privacy violations, possible attack vectors, non-existent security, and constant data breaches of the manufacturers that threaten to expose every bit of personal data that they suck up. Not to mention increased sluggishness after tons of unwanted “improvements” are stuffed into it over the years, as the chipset ages and can no longer cope.
I’d much rather spend a tenth of the price of my TV on a streaming box (Roku, Shield TV, etc.) and replace those after similar things happen to them in a few years. For example, the display of my OG 32-inch Sony Google TV from 2010 ($500) still works fine, but the OS has long been abandoned by both Sony and Google, and since 2015-16 even basic things like the YouTube and Chrome apps don’t work anymore. Thank goodness I can set the HDMI port as the default start-up, so I don’t ever need to see the TV’s native UI, and a new Roku Streaming Stick ($45) does just fine on this 720p panel. Plus, I’m not locked into the Roku ecosystem. If they begin (continue?) enshittifying their products, there are tons of other options available at a similar price.
Most people don’t replace their TVs every couple of years. Hell, my decade-old 60-inch Sharp Aquos 1080p LCD TV that I bought for $2200 back in 2011 still works fine, and I’ve only had to replace the streamer driving it twice in all that time. Sony Google TV Box -> Nvidia Shield TV 2015 -> Nvidia Shield TV 2019. I plan to keep it in my basement until it dies completely before replacing it. The Shield TV goes to the LG C2 so that I never have to see LG’s craptastic UI.
Sorry, just felt the need to vent. Would be very interested in reading the community’s opinions on this topic.
You actually can buy quality dumb TVs, but you have to do the legwork and research what are often referred to as “commercial displays.” I see them everywhere in businesses for ads and showing the menu. They’re sometimes a little pricier, but they’re usually built a little “beefier” too, as they’re expected to deal with rougher usage in, say, a restaurant context.
However, the other solution is the one you’ve already mentioned where you never plug the Smart TV into the internet, and instead bypass the “smart” on the TV with your own streaming boxes.
I think as more people realize there is a market for dumb TVs, you’ll start to see that market grow more and more until they’re no longer just “commercial displays.” Just gotta get enough people buying them and not buying Smart TVs.
I think if enough people never gave them Internet access, the manufacturers would start adding in cellular modems to ensure they get the data flowing (that is, data on your viewing habits and sending you ads).
Having worked in this field, I can tell you how it usually operates: You want the most data for the least amount of investment. As soon as your operational costs start to eat into your already thin margins, the equation falls apart.
Complex solutions designed to capture data from that 1-3% of users who actively avoid it end up costing a lot more money than their data is actually worth. In order to make this particular solution work, you need to make enough money selling whatever tiny amount of data you get from those 1-3% of users to cover the cost of putting a cellular modem in all of your TVs plus the ongoing cost of paying various regional cellular networks to deliver that data to you. You are likely tripling or quadrupling the total cost of your data collection operation and all you have to show for it is a rounding error. And that is before we factor in the fact that these users likely aren’t using the built in streaming apps, so the quality of the data you get from them is below average.
The cheaper option would be to set up an ad-hoc tv-to-tv network. You might not let your TV talk to the internet, but I bet your neighbour does, or if not, then their neighbour will.
The “Anti-Fraud Community Group” already thought of that:
https://github.com/antifraudcg/proposals/issues/17
Device mesh (Androids/Chromes) to share suspicious behavior
The proposal is to use the consensus between devices on genuine and suspect characteristics
A device should be able to query from a safe and reliable source if another device has performed (within a defined period of time) some malicious action similar to the one it is going to perform, so it could make the decision not to perform that same action, autonomously.
…just in case you wanted to install an ~~ad blocker~~ malicious software, or something.
deleted by creator
It’s amazing how Batman Forever predicted the then-future of television, up to and including most people trading in security/privacy for convenience.
1-3% of users might not be enough people, but what is the break-even % of people to justify adding a cheap cellular modem? 5%? 10%?
You are likely not even doubling the cost of the data collection operation. We’re talking under $0.50 in additional hardware per unit, with relatively low data usage requirements. The servers to collect that data are likely already more expensive, and you can easily sell user viewing habits for way more than $1/month/user. You can use a prepaid low-usage data-only eSIM with global roaming for less than $5/year, and only renew it for the devices that don’t get hooked up to a user’s WiFi. If it was only needed for 5% of the users, or 1 in 20, you could still get an ROI of under a year.
With a device life of 5+ years, it’s definitely much more than a rounding error. Keep in mind the profits go directly to the manufacturer, so it’s a % of the product cost at origin, not of MSRP… which is pretty much the reason why all manufacturers have jumped onto the data collection bandwagon in the first place.
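If anyone wants to sanity-check that, here’s a rough back-of-envelope sketch in Python. All the figures are the estimates from the comment above (plus a $1/month floor for the data value), not real BOM or market data:

```python
# Rough break-even check for putting a cheap cellular modem in every TV.
# Figures are the commenter's own estimates, not real costs.
modem_cost = 0.50            # extra hardware per TV, USD
esim_cost_per_year = 5.00    # prepaid low-usage data-only eSIM, USD per year
data_value_per_month = 1.00  # conservative floor for viewing-habit data, USD per user per month
offline_share = 0.05         # 1 in 20 TVs never joins the owner's WiFi

# The modem goes into every unit, but the eSIM is only renewed for the
# offline units, and only those units produce the extra revenue.
extra_cost_year_one = modem_cost + offline_share * esim_cost_per_year
extra_revenue_per_year = offline_share * data_value_per_month * 12

print(f"extra cost in year one, per unit sold:  ${extra_cost_year_one:.2f}")
print(f"extra revenue per year, per unit sold:  ${extra_revenue_per_year:.2f}")
# -> ~$0.75 cost vs ~$0.60 revenue in year one at the $1/month floor, so
#    break-even lands early in year two; at the "way more than $1/month"
#    figure claimed above, it pays back within the first year.
```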
That’s what they do with CPAP machines.
My CPAP is always in airplane mode. Hopefully solved that problem.
I feel like the market is only going to grow at the top end: the audio/videophile sort of area, with large, high-quality panels and top-end feature sets.
The low end tends to be partly subsidized by the “smart” features. Think TVs that show ads in the menu, or Amazon or Google screens that want you to use their services because it’s “easy” and they’re “right there” so maybe people will subscribe. Couple that with the “feature” that it’s already built in so it saves you an extra box/purchase for people who want cheap TVs, and I don’t see it going away anytime soon.
Exactly this.
Manufacturers are NOT INTERESTED in selling low-cost dumb TVs when they can sell smart TVs and get long-term returns. They are even willing to sell the TVs at cost because they will monetise later with ads and selling your data.
Manufacturers don’t want you to have a dumb TV, they want everyone to go smart - which is part of why business-targeted dumb panels are priced higher - to disincentivise regular end-customers from buying.
oh…is that why all these nice smart TVs are so affordable these days?! damn!!
Normal manufacturing efficiencies and cost reductions are surely the biggest reason they’re cheaper now, but it’s absolutely a factor.
So many companies in so many industries are trying to move from being product companies (make money selling a thing) to being service companies (make money from subscriptions, user data and other monetisation) and I’m doing my damnedest to keep away from any of it.
It could get interesting with right to repair, that probably includes the right to load custom firmware…
There’s no down-side to selling a smart TV to someone who doesn’t want one/doesn’t use the features.
The features we “want” from modern TVs, like Dolby Vision and all the shit they do to the image to make it stand out in the store, require a significant amount of processing power.
It’s simply better business to sell smart TVs to everyone than to make dumb TVs that compete for a tiny fraction of the market, when people buy smart TVs in every price segment.
The paradox being that if there were “premium” smart TVs for people like us - with proper support, privacy, customization options and no crap like ads - we’d probably buy them, and pay a premium for them.
But that’s just too much work for them and they probably don’t even realize that kind of market exists.
I think they know it, I just don’t think they care. It’s a niche market. On top of that, they’d have to convince the people in that market to trust them.
If they can get a 10-20% return on 10,000 Smart TVs, why waste the effort on properly developing and supporting 3 PrivaTVs (patent pending, exclusions apply, see your local drunk for details)?
I could be wrong, I just don’t think the market is large enough that they’d be willing to throw manpower at it.
I think people underestimate the value of their tracking data. For a manufacturer, the benefits over the lifetime of the device can be way higher than 20% of the manufacturing cost.
They could still develop and support those 3 PrivaTVs, but the MSRP would easily be a few times higher than that of an equivalent Smart TV.
How many times higher? If a regular TV costs $100 then they’re making $20k plus marketing data. The PrivaTV would need nearly a $7000 markup for the same return.
Obviously these are made up numbers for illustration. I think that for big manufacturers it’s not worth it for the return and amount of effort they would need to spend. Maybe a small manufacturer could do it. Maybe that would spur the big guys to buy them out and take it over once the hard work is done.
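For what it’s worth, here’s that arithmetic spelled out in a couple of lines of Python, using the same made-up numbers purely as an illustration:

```python
# Illustration only: the commenter's made-up figures, spelled out.
smart_tv_units = 10_000
smart_tv_return_total = 20_000   # USD, "they're making $20k" per the comment above
return_per_smart_tv = smart_tv_return_total / smart_tv_units   # -> $2 per unit

privatv_units = 3
markup_per_privatv = smart_tv_return_total / privatv_units     # -> ~$6,667, "nearly $7000"

print(f"per smart TV: ${return_per_smart_tv:.2f}, per PrivaTV: ${markup_per_privatv:,.0f}")
```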
A TV manufacturer doesn’t need to develop that PrivaTV from scratch, they can get their SmartTV and just rip out the Smart part, for a much lower markup.
A big manufacturer would actually have it easier; they’d just need to make “privacy” into a selling point, then slap a “Private” sticker on instead of the “Smart” one.
Hopefully with the “right to repair”, we might see some people ripping the smarts out of a SmartTV, possibly just by flashing an updated firmware, so that might convince manufacturers to give it a go too.
I was specifically talking about what the original commenter said.
with proper support, privacy, customization options and no crap like ads
Dumb TVs are already a thing as mentioned elsewhere. Commercial Displays cost more but you can beat someone to death with them and they’ll still work.
I’m with you on hoping for more options. I’d hate for my next TV purchase (hopefully years from now) to be forced online under the guise of firmware updates to steal my viewing habits.
I think you’re right mainly in that what they’re doing now is sure and easy money. Why risk it, right?
They aren’t very good though. They are durable, but usually expensive and missing a lot of features you might actually want for that price tag. For example, I’ve yet to find any OLED “commercial displays” that support Dolby Vision, VRR, and eARC.
It’s way cheaper and easier to just buy the TV you want and not connect it to your wifi.
Computer monitors should work too, and are more readily available. Just dig through the business oriented monitors and ignore the gaming ones, as cable providers aren’t really going to have anything that can take advantage of >60 fps display rates.
My personal experience with computer monitors is that they work great except they always seem to cheap out on speakers if they have built in speakers. Tiny, tinny things whose volume is always way too low.
I don’t mind having separate speakers, but once in a while it would be nice to not need them.
I don’t think I’ve ever heard what my TV speakers even sound like. I’ve never used them.
Same, I think I never used them, when I bought my latest TV I already had my good old 5.1 system
Even on a high end TV the speakers are going to be bad. It’s just there to check a box. TVs are so thin that you cannot physically fit in speakers large enough to sound good.
A cheap sound bar will make a huge improvement to audio quality over any built in speaker system.
Right, but at that point, may as well just invest in a fucking PC monitor. Like what else is a TV really bringing to the game that a monitor can’t?
Like, if they can’t put in speakers worth a damn, what’s the point of even including them?
Like what else is a TV really bringing to the game that a monitor can’t?
A tuner and a remote control.
Find me a reasonably priced 70” monitor and i will hail you as the next coming of christ. That is the holy grail for me.
Size and picture quality.
Since I’m going to be skipping the TV part with my HTPC anyway, why not simply use a computer monitor? Nowadays you can also get a 40+” monitor, and that should be big enough for most people. These things might not even have any speakers, so you might need to plug it into an audio system to make it all work.
The other option is to buy the smart TV, turn off the networking, and hook it up to a Shield, Apple TV, or Roku. All those box makers are going to support the devices longer than TV manufacturers, and the streaming apps can’t ignore them.
so is using something like an Apple TV or Roku box actually more secure than just using the apps directly on the TV?
Yes, because streaming boxes can be upgraded independently of the TV and so you can always have hardware that’s actively supported. My old Roku 3 was still getting updates as of a few years ago, while my “smart” TV from 2015 stopped getting security updates long ago.
Last time I looked for commercial dumb TV, a SHARP was like $4000 for a 65" 1080p or something :-/
$910 for a 65" 4k Samsung display.
https://www.samsung.com/us/business/displays/4k-uhd/qe-series/qe65t-series-65-lh65qetelgcxgo/
Not bad. I’m in Canada, so I’m wondering if I could find it, but I’d like the 75" one, at about $2k US. I guess a Sony from Costco would have a better picture.
However, the other solution is the one you’ve already mentioned where you never plug the Smart TV into the internet, and instead bypass the “smart” on the TV with your own streaming boxes.
I did this for a long time on my old Vizio TV, but the experience was notably worse with external devices compared to built-in, due to the limited framerate support over HDMI. This led to awkward juddering when e.g. trying to play 23.976fps movies with only 30hz or 60hz output. It also meant built-in video features like motion interpolation did not work effectively.
I guess this is less of an issue today with VRR support on high-end TVs, but still, a lot of devices you might connect to a TV don’t support VRR.
Your streaming box was either not configured properly, or was very low cost.
The most likely solution is that you need to turn on a feature on your streaming box that sets the output refresh rate to match that of the content you are playing. On Apple TVs it is called “match frame rate”. I know Rokus and Android TV devices have similar options.
Newer TVs can detect when 24 fps content is being delivered in a 60 hz signal and render it to the panel correctly, but this doesn’t usually work if you have the selected input set to any low-latency modes (“Game”, “PC”, etc)
Good to hear newer devices support this.
My experience was from quite a few years ago (2015ish). At that time, there was no such feature in any of the devices I tried connecting, including a few brands of Android phones, Fire TV sticks, and MacBooks. I remember reading into the documentation on other devices at the time to find something better, with no luck. That said, documentation was pretty poor all around so who knows? The most useful info I found was in threads on VideoHelp or AVS forums where other users reported similar issues on various devices. Android TV was still very new and very shitty back then.
At this point I would simply not buy anything that doesn’t support VRR.
This is one of the downsides of the widespread adoption of HDMI; it has quite a few of them. Something like DisplayPort would be better, but it’s far less common. Such is life.
How is this a downside of HDMI?
It sounds to me like the user’s TV or streaming box is configured incorrectly. DisplayPort doesn’t magically remove judder from 24fps content being rendered into a 60Hz signal.
DisplayPort never saw widespread adoption in the home theater space because it never tried to. The standard is missing a ton of features that are critical to complex home theater setups but largely useless in a computer/monitor setup. They aren’t competing standards, they are built for different applications and their featuresets reflect that.
Newer revisions of HDMI are perfectly good, I think. I was surprised and dismayed by how slow adoption was. I saw so many devices with only HDMI 1.4 support for years after HDMI 2.0 and 2.1 were in production (probably still to this day, even). It’s the biggest problem I have with my current display, which I bought in 2019.
GP’s problem probably isn’t even bandwidth; rather, they need to enable their TV’s de-judder feature or configure their streaming box to match the output refresh rate to the content being played.
VRR support came with HDMI 2.1.
You could still have your player device set to a static 24 or 30 without VRR, in theory, but none of the devices I tried (granted, this was ~8 years ago) supported that anyway.
VRR is really meant for video games.
You could still have your player device set to a static 24 or 30 without VRR, in theory, but none of the devices I tried (granted, this was ~8 years ago) supported that anyway.
That’s interesting. Pretty much every Blu-Ray player should support this. I can confirm from firsthand experience that Apple TV, Roku, and Android TV devices also all support this. I can’t speak for Amazon’s fire stick thingy though.
The feature you are looking for is not to manually set the refresh rate, but instead for the device to set it automatically based on the framerate of the content being displayed. On Apple TV it’s called “match frame rate”.
This is good to know, thank you for the info. I am getting worried about my increasingly old TV (15+ years) and I do not want a smart TV to replace it.
Is it just me or is it really fuckin’ easy to not connect your TV to the internet?
I’ve hated “Smart TVs” for a decade now, but I solved my problem by just buying a set top streaming box (Apple TV, Nvidia Shield, etc) and leaving my TV off my WiFi.
Some smart TVs will whine incessantly about not having the internet.
Thankfully mine (Philips) only bitched about it for about a week, then gave up. Now the only real complaint I have with it is that it takes forever to boot, considering it has to fire up Android after it’s been off.
LG doesn’t do this. They also have the good sense to allow firmware updates via USB. Which is great, because turning on WiFi long enough to install an update fills the home screen with junk.
I have an LG that is a couple years old. Never connected it to the Internet and don’t intend to but have wondered about firmware updates for it. I am afraid of an update adding ads or something else I don’t want. What is your experience? Or is there a resource that details everything (and I mean everything) that the updates change?
I’ve installed two firmware updates on my C1 and they have never added advertisements. I installed them because they both fixed specific bugs I was experiencing with my home theater system.
I don’t see why they would try to shove ads in an offline firmware update when it is both easier and more useful to download them from the internet once the device is connected. It’s hard to make money from ads when you can’t actually track user engagement.
That said I would only bother updating your TV’s firmware if there is a bug fix or feature you need from a newer version.
I also have a C1 and have been annoyed that it won’t turn on my connected AVR when I turn on the TV even though it has the capability and it turns it off when I turn the TV off. This wouldn’t happen to be one of the bugs you upgraded to fix, would it? What bugs did you encounter that you fixed with firmware upgrades?
Not OP, but what you’re referring to is called CEC control. Maybe just have a dig through your settings on both devices to check CEC is enabled.
is called CEC control
LCD display
HDMI interface🤪
I am aware of the CEC settings and they are working - the TV will power off the device just fine using CEC and it has the ability to power it on (I can manually trigger this) but the TV does not send a power on command to the AVR automatically when the TV is turned on. This seems to be a known issue but I don’t have a link to the forum discussion I found a while ago where others have the same problem.
The bugs I was having were related to eARC not working properly when G-Sync was enabled on my PC. I haven’t had any problems with my C1 not responding correctly to “one touch play” CEC signals from my PS5 or Apple TV.
Well I’m glad you got them fixed!
A good place to check is avforums or avsforums. There have been a lot of CEC and ARC issues (on all brands!). And a lot of people discussing the different updates (while laughing at the useful release notes)
I personally found that CEC power on only worked when I had the amp input selected, or used ARC.
Is it just me or is it really fuckin’ easy to not connect your TV to the internet?
I prefer not to reward corporations by buying equipment with built-in spyware.
(Also, “easy to not connect” depends on whether the TV nags you, or disables features, or uses any open wi-fi it finds, or includes a cellular or mesh modem.)
You’re just giving the same companies even more money when you buy their much more expensive “dumb” digital signage products.
Nobody’s been able to show me a TV that actually does those other things you suggest. If one did, I wouldn’t buy it, but I won’t base my current purchasing decisions on hypothetical future products.
You’re just giving the same companies even more money when you buy their much more expensive “dumb” digital signage products.
No, I am not.
(And even if I was, it wouldn’t boost the sales numbers of spyware products, encouraging more of the same.)
Nobody’s been able to show me
If you don’t want to believe it’s a problem, I don’t expect anyone wants to waste their time trying to change your mind.
(Jay did report seeing examples in the wild, though.)
No, I am not.
Who do you think makes these digital signage products? They all come from LG, Samsung, Hisense, etc.
If you don’t want to believe it’s a problem, I don’t expect anyone wants to waste their time trying to change your mind.
Show me a TV that ships with a cellular modem or that connects to open wifi networks without being prompted, and I won’t buy it. I’m not the one with the burden of proof here. It’s very easy to see if a TV does any of this shit before you buy it just by checking reputable review sites like rtings. So telling people any TV they buy at Costco does this is just spreading FUD.
So telling people any TV they buy at Costco does this is just spreading FUD.
Nobody has said that.
This honestly and embarrassingly didn’t occur to me.
I got a roku for my smart TV because I wanted something with a Jellyfin app. I don’t trust roku any more or less than Vizio, but I find I like the idea of removing internet access to the TV directly.
Yeah they can try and push as much BS as they want but in the end I’m never using their software and always using a shield
Back in 2019 I wanted a nice LED screen with high resistance to screen burn but the only economic option was a Samsung Smart TV.
I actually ended up getting it, ordering a custom mount for the ARM chip, and using an input method on the chip that makes it run Java natively so that I could make the Smart TV drop its firmware onto a USB, and from there I could modify it, since it was just running a version of Linux.
So that’s the story of how I un-smarted my TV. Get fucked, Samsung.
What a fucking ridiculous workaround that’s completely unavailable to the regular consumer…fuck Samsung (and the industry in general) for this approach.
Well, now that the community has had a few years to reverse engineer it I assume there are a lot of better and easier ways to replace the firmware with an open source quick fix. So, it’s not like my way is the only way. It was just necessary at the time. In fact, the community worked very fast to find a way to hack these “System on a Chip” architectures since the ARM chips were first released. They’re used in Macs, phones, TVs etc and have a very high power efficiency, but it is a very clear design choice to make them extremely difficult for the user to access and customize.
Unless it’s “load this file on a USB stick and plug it in your TV” easy, it’s still out of reach for most consumers.
But my point is, it shouldn’t be necessary to do these things in the first place. Fucking drop the “smart” element from them completely, they always suck ass anyway and are laggy as hell to navigate.
Smart devices are basically data sniffers, scooping up any info about you, your family, and your habits. They watch network traffic, listen to your conversations, and record video. I’ll stick to dumb devices, thanks.
deleted by creator
Oh yeah I put my meat in your fridges mom
deleted by creator
Sony Bravia models now give you an option to make it a dumb TV as part of the out of the box experience. It’s the first question they ask you when you power it on.
Wait, really? Is this also the case for older models? I have one but it’s already a few years old. 🤔
My X950H does not give the option (although there is a hidden dev/“pro mode” that allows you to turn it into a dumb screen I think) but my newer A80J model does give the OOTB option to disable the smart features.
The Bravias with Google TV are at the absolute limit of what I will tolerate from a smart TV. Suggestions/tailored stuff on the Home Screen, but no invasive ads. Anything further and I’d turn them into dumb TVs and use an Apple TV or Google TV dongle instead.
I got a display signage TV. Totally dumb. The only app it has is YouTube and that’s optional. I don’t even have the internet hooked up to it. Works fine for gaming and occasionally streaming via other devices.
Where do you even get something like that?
I got mine through Amazon. Samsung makes the cheapest ones I’ve found. Just search for something like “samsung commercial TV”. They’re generally a little more expensive than your ad/data harvesting-supported TVs but if you value your privacy and longevity of your devices, it’s worth it.
Also, these industrial monitors have better heat sinking from the LED backlight, which increases power efficiency and service life – the two metrics their intended buyers care most about.
deleted by creator
I have solved this by not buying a TV in the last two decades. I just own projectors. Larger screen, cheaper, no “smart” nonsense. Depending on mounting, essentially invisible when not in use and not a large black rectangle in your living room. Do recommend.
How dark do rooms need to be for them to work? Are there issues with shared spaces where someone might want a well lit workspace?
I also want to know this
Having the sun shine through a large window is an issue, but it’s also an issue for a good picture on normal TVs. Picture quality with projectors is better when the room is darker (it increases contrast), but a normally lit room is just fine. It also depends on how and what you’re watching. I generally do darken the room when I’m actively watching a movie, but there’s no need for that when putting something on you’re just half watching. You can still tell just fine what’s going on even in a bright room, it just looks a bit washed out.
It also depends on the brightness/class of the projector of course, and on the screen. Don’t underestimate the visual difference a screen makes. Both having any screen over just projecting onto a white wall, and a great screen over a cheap random one.
The core issue is that a projector produces bright by throwing light, and dark by not throwing light. If your surface (screen or wall) is rather white and is illuminated even when the projector isn’t actually projecting light onto it, then that is as dark as a black part of the picture can possibly get. There are screens that are reflective but more gray than white; those help with that, too.
I would say a normally lit room (with artificial light in the evening, for example) is fine to use a projector. “Well lit workspace” really depends on your definition. For my definition of “well lit” it wouldn’t be ideal, but I’ve just installed like 49000 lumens of illumination into my 3.5 x 3.5 meter workshop, cause I like to see what I’m doing and life is too short for bad lighting.
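To put some very rough numbers on why ambient light washes the picture out, here’s a toy contrast calculation; the luminance values are made up purely for illustration, not measurements:

```python
# Rough on/off contrast estimate for a projected image. "Black" on a
# projection is just the screen lit by ambient room light, so ambient
# light sets the black floor.
projector_white = 100.0              # relative luminance of a full-white patch (assumed)
for ambient in (1.0, 10.0, 50.0):    # dark room, normally lit room, bright daylight (assumed)
    white = projector_white + ambient
    black = ambient                  # the projector throws no light for black
    print(f"ambient {ambient:>4}: contrast ~ {white / black:.0f}:1")
# -> ~101:1 in a dark room, ~11:1 normally lit, ~3:1 in a bright room:
#    the picture stays watchable but looks increasingly washed out.
```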
Thanks, that’s a lot to think about. We currently use an oled computer monitor as a TV (hooked up to a pi) and it’s beautiful but there are limits on screen size and it’s crazy expensive (you’re paying for stupid fast refresh rates and the Gamer™ markup)
Our house is very bright during the day, lots of glass in sunny Australia, so it’s probably not a great candidate for a projector generally, but it does have me thinking about one in the bedroom for late night movies. Probably a lot cheaper and neater than another absurd monitor.
My last projector came with a low power android TV stick. I thought that was pretty cool, even knowing that I’d never use it. That means the smart TV features are there for the people that want them and can literally be thrown in the garbage for those that don’t.
I just run an HDMI from my computer…
I think you might have AngrilyDoublePosted lol
Yes. IMO if you want a smart tv just connect your laptop or computer to it. I’ll never buy a “smart” tv
I’ll probably need to buy a new TV in a year or two. I read there are some ways to flash custom firmware on it.
Just get a monitor. The only real difference between a monitor and a TV these days is the lack of a speaker and the “smart” stuff. But TV speakers suck anyways, so you’d be better off using a soundbar regardless.
I haven’t seen >40" monitors at a reasonable price though compared to TVs
Edit: also, there’s usually not any audio output to an amplifier on monitors.
As I mentioned earlier, use a soundbar or dedicated speakers (most TV speakers suck anyways). Also, for a reasonably priced monitors, look for monitors marketed as “commercial displays” - they’re generally the same price or even cheaper than a similar spec’d TV.
As I mentioned earlier, use a soundbar or dedicated speakers (most TV speakers suck anyways).
Yes, but there is no audio output (as in a RCA, Optical etc., not built-in speaker) to get the audio from the monitor to the amplifier.
Depends on the monitor, my one has a 3.5mm jack to get analogue audio out of the HDMI input which I use to get audio from my Xbox to the rest of my setup.
None of the >40" monitors I’ve looked at today had any audio outputs. But finding one that isn’t an ultrawide format for gaming is probably the bigger issues it seems.
??? The output is provided by whatever box you’re connecting to the monitor - set-top box, Android TV, Apple TV etc.
Not true at all… e.g. Chromecast doesn’t have a dedicated audio output and neither does the Apple TV; they only have HDMI output. Now, HDMI does also carry audio, but many amps, and especially soundbars, don’t have HDMI passthrough and rely on getting the audio signal from the TV/monitor if you’re using those devices.
There are plenty of HDMI switches or splitters out there that support audio extraction; just use one of them between your source and your monitor. Like this one: https://www.amazon.ca/gp/product/B00XJITK7E
And big monitors come with TV software nowadays…
Can you get monitors in 50 inch these days though? Then this would be my route as well once my current dumb TV dies
Yes, Viewsonic for instance is one company that makes them. Although, they’re typically advertised as a “commercial LED display” or something like that. Basically look for “display” instead of “TV”.
Yep, they’re horrible. I always disable internet on them, uninstall any apps I can, and generally do what I can to avoid using the built-in smart TV, but I shouldn’t have to do this; it’s unfortunate and sucks to deal with. They just take advantage of consumers who don’t know better. I wish the TV market wasn’t like this. :/
Just wanted to say same. I have used a Linux box as my Media Center and Home Server since 2008. Also have a Chromecast dongle so I can stream from Android and Android apps. Not sure what else one needs.
Seems to me what one really wants is mostly a browser and the ability to stream stuff from apps on your phone. Since the Linux box is a Media Center and Server, it also has a lot of features a Smart TV would not have. I just do not see the value of a Smart TV.
I just saw a link to this
https://github.com/MayaPosch/NymphCast
Perhaps your Linux box can cast as well?
Pro Tip: Buy a Computer Monitor e.g. 4k 34 inch
They don’t have any smart TV shit, but you need to buy something extra for the audio.
that’s a lot more money for a smaller screen, though. 32" is a big monitor sitting in front of a desk, but a small TV if you’re on a couch.
you can buy “gaming” monitors that are 48" big for example https://x-kom.de/653504-gigabyte-aorus-fo48u-475-zoll-4k-gaming-monitor-hdmi-21dpusb-c-120hz-hdr-oled (caution! german website)
or this one: https://www.lg.com/de/monitore/lg-48gq900-b
Idk at which point they are starting to implement some kind of “smart tv”
caution! german website
Is there something scary about German sites that I’m not aware of?
jesus what are those prices
deleted by creator
That second one is apparently sub-ms latency, which is incredibly unnecessary for a TV.
If you mean the 0.1ms, that’s not latency. That’s grey to grey speed which is practically meaningless. The image can’t get from your PC to the monitor in 0.1ms.
The fastest monitors I can find have a real latency of 1.7ms, and that’s a 1080p monitor running at 360Hz. When you get down into the 4k 120Hz range that the top TVs do, the speed falls to around 5ms, which is about the best you’ll get from a TV.
“gaming” and of course NO SMART SHIT
Many features that come with TVs these days are not available on monitors. For example “filmmaker mode”
Monitors are effectively always in ‘filmmaker mode’, as they don’t do frame interpolation and colour grading and over-scanning and all the stuff that filmmaker mode disables.
I don’t think there are many monitors that support 24fps natively.
A 120Hz or 144Hz (reasonably common) monitor can do 24fps with no issues, as each film frame maps to exactly 5 or 6 refresh cycles (see the sketch below).
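A quick sketch of that cadence math, for anyone wondering why 24fps content judders at 60Hz but not at 120Hz or 144Hz (the refresh rates are just common examples):

```python
# How many display refreshes each 24fps film frame gets at common refresh rates.
# A whole number means every frame is held for the same time (smooth cadence);
# a fraction means frames must alternate hold times (3:2 pulldown judder).
content_fps = 24
for refresh_hz in (60, 120, 144):
    repeats = refresh_hz / content_fps
    note = "even cadence, no judder" if repeats.is_integer() else "uneven cadence (judder)"
    print(f"{refresh_hz} Hz: {repeats} refreshes per frame -> {note}")
# 60 Hz  -> 2.5 (frames alternate between 2 and 3 refreshes: 3:2 pulldown)
# 120 Hz -> 5.0 (smooth)
# 144 Hz -> 6.0 (smooth)
```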
I’m not sold on the idea. I did find a couple of OLED 4K 120Hz monitors in “TV size” (BenQ MOBIUZ EX480UZ), but they don’t seem to have (or at least advertise) HDMI ARC or HDMI-CEC support, the brightness is only 480 nits (vs 800 on an LG C2), and it seems more expensive.
I would not recommend it as a replacement for an actual TV in a living room.
Samsung commercial displays don’t have any smart features, I think.
They don’t come with a stand.
https://www.samsung.com/us/business/displays/4k-uhd/qe-series/qet-series-43-lh43qetelgcxza/
I’d much rather have a VESA mount than a stand. VESA stands are easy to find and cheap, and if you want to wall mount or get a fancier stand that is an option.
The specs on those QET models don’t look great. 300 nits max brightness? That can’t be right. Aren’t they designed to be in brightly lit areas like inside stores?
Like a lot of people here, I did the same: bought the smart TV, gave it internet for the firmware upgrade, and once it was up and no longer asking for my input or whatever, I set HDMI1 as the startup input and plugged in a Chromecast. Then I went into the TV menu to forget the network settings on the TV. It’s just a monitor used to cast Netflix, Disney, Plex, Prime, etc.
How is plugging in a Chromecast any different than using the same software built into a Google TV?
As the owner of the original $500 Sony NSX-32GT1 TV from 2010, I can tell you exactly how different it is. Its UI wasn’t exactly a snappy experience to begin with, but it got even more sluggish over the years until everything stopped working due to being EOL. The OS on it has been unsupported since around 2016, so it’s stuck on an ancient Android TV version. Most apps (even built-in ones like YouTube) stopped working a few years later, and cannot be updated.
Sure, a $50 Chromecast will eventually suffer from the same problems, but I can replace it 10 times for the same amount of money while keeping my TV because its 32-inch 720p panel still displays content just fine.
While true, the context of this discussion was mostly about privacy rather than functionality. A built-in Chromecast vs. an external one changes nothing on that front. There is also nothing preventing you from plugging that fancy new external Chromecast into the old Sony and getting new functionality out of it, if the display is still to your liking.
I don’t know about Google TV, but other brands have microphones and ads in their built-in one…
I just never entered my WiFi details into my smart TV. I only use the HDMI inputs on it anyway, so it behaves like a dumb one. It’s an RCA TV from Walmart, if anyone is wondering.