Comcast says it stands for a 10 gigabit cable internet network they are building (it doesn’t exist yet), so they’re basically changing the meaning of the G from “generation” to “gig” to make it sound like 10G is five generations better than 5G (or twice as fast)…or that they actually have a 10 gigabit network. Neither is accurate. It’s still just cable internet that people have to use because they have no other option.
Fuck Comcast.
I read online that they’re abandoning the “confusing” 10G branding, but I just saw a commercial for it. They think all of their customers are morons, and they count on folks having no other choice in a lot of cases.
Apologies to anyone outside the United States; this is just complaining about our poor internet options and the deceptive advertising of greedy corporations.
There are 4x the pixels so…
I don’t disagree with the change either. Large numbers make things harder to compare: after 2160 it’s 4320. 2k, 4k, and 8k are far easier to remember and to tell apart.
In a past life I was a video editor back when 4k was still in its infancy, and a coworker of mine was furious, saying reporters were idiots for claiming 4K was four times the size of HD when it was really just a name. And I’m like, dude, it actually is four times more, and showed him a side-by-side picture of the two frame sizes. He was pretty embarrassed, but I told him it was okay because I’d thought the same thing until I read an article about it.
My interpretation is that 1080p is 1k and 4k is four times as many pixels, so it kinda makes sense? Not really, but w/e.
Exactly. The uptick in resolution was slow: 360 to 480 to 720 to 1080, relatively small improvements. Then we jump to 2160/4k and the pixel count quadruples compared to 1080. 4k is four 1080 screens put together.
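Quick sanity check with plain arithmetic, using the standard 16:9 frame sizes:

```python
# Pixel counts for 1080p vs. 2160p ("4k"), both 16:9
p1080 = 1920 * 1080   # 2,073,600 pixels
p2160 = 3840 * 2160   # 8,294,400 pixels
print(p2160 / p1080)  # 4.0 -> 4k really is four 1080 screens' worth of pixels
```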
Totally agree, but then the naming is internally inconsistent!
If 4k means four times the pixel count of 1080, then 2k means 1440-ish (strictly it should be about 1530, i.e. 1080 × √2); that’s fine. But then 8k would have to be about 3050 (1080 × √8), and it is actually 4320!!!
So the k cannot refer to the total number of pixels (quadratic scaling). On the other hand, if we assume linear scaling, with 8k at 4320 and 4k at 2160, then 2k is 1080; but 2k is never used in that sense!
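Spelling out the two scalings with rough numbers (assuming 16:9 frames throughout):

```python
# If the "k" tracked total pixel count (quadratic), vertical lines would grow with the square root
base = 1080
print(round(base * 2 ** 0.5))  # ~1527 -> what "2k" would have to be
print(round(base * 8 ** 0.5))  # ~3055 -> what "8k" would have to be, yet 8k is really 4320

# If the "k" tracked vertical lines linearly instead
print(base * 2)  # 2160 = 4k
print(base * 4)  # 4320 = 8k, which would make "2k" plain old 1080
```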
Edit: as you can see I’m very passionate about this XD
Eww, who refers to 1440p as 2k? 1080p is 2k. Not that anyone really says 2k to begin with.
That said, that particular instance is irrelevant as long as things are consistent going forward.
2048x1080 is DCI 2K.
That slight difference in ratios is why home releases of films often have small black bars at the top and bottom: the DCI flat ratio (1.85:1) is a bit wider than 16:9.
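Rough numbers for the ratios (the 1998×1080 figure is the usual DCI 2K flat size, worth double-checking):

```python
# Aspect ratios: DCI 2K container, DCI "flat", and consumer 16:9
print(2048 / 1080)  # ~1.90 (full DCI 2K container)
print(1998 / 1080)  # 1.85  (DCI flat)
print(1920 / 1080)  # ~1.78 (16:9)
# A 1.85:1 film shown on a 16:9 screen leaves thin letterbox bars top and bottom
```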