• Bgugi@lemmy.world · 58 upvotes · 8 months ago

    Fuck that. Instead of making them increase their imaginary “up to” numbers, make them advertise contractually guaranteed minimums. I’d rather have a 25 Mbps minimum than a 100 Mbps maximum that usually sits around 8 Mbps.

    • Rapidcreek@lemmy.world (OP) · 17 upvotes · 8 months ago

      When I bought internet services and colocated with major carriers, every contract came with a Quality of Service rider that stipulated guaranteed quality and quantity of service. If my metrics fell below those minimums, I had recourse. But I could not extend that to my customers, because they were using a shared resource I was providing. In general, though, I agree that there should be a QoS guarantee with every user connection.

    • mosiacmango@lemm.ee · 51 upvotes · 8 months ago

      A 4x increase in the required download speed and a 7x increase for upload.

      That’s a pretty solid improvement, honestly. They also have a plan for when to increase it to 1 Gbps down / 500 Mbps up, so it seems like they’re taking it seriously.

      • umbrella@lemmy.ml · 1 upvote · 8 months ago

        My third-world country’s internet has a minimum of 100 Mbps on most plans in the cities.

        100 Mbps in the supposedly best country in the world is shit, no matter how much higher it is than 2003 standards.

    • Montagge@lemmy.zip · 15 upvotes · 8 months ago

      lol I’ve never had anything over 12 Mb/s. Currently I have 8 Mb/s, which costs roughly half of what I used to pay for 500 kb/s.

      I would love to have 100Mb/s. Hell even half that.

      • ripcord@lemmy.world · 5 upvotes · 8 months ago

        It’s interesting. I have a remote place (not where I live) in the least populated, podunkest county in the state (which is saying something). And we were still able to get fibre and 50Mbps out there (and it could be higher, but not really worth the extra money since it’s rarely used).

        Still within a couple hours of a big city, though. Guessing you’re further away than that, or something?

        • Montagge@lemmy.zip · 1 upvote · 8 months ago

          The 500kbps was 15 minutes outside of a metro area of 2.5 million lol

          It was decades of CenturyLink making sure no one else moved in on their turf.

          Where I’m at now, the fiber is a couple of miles away and there’s no cable, but 8 Mbps feels lightning fast after CenturyLink lol

      • hperrin@lemmy.world · 2 upvotes · 8 months ago

        That’s enough to watch exactly one 1080p 30fps stream on YouTube and literally nothing else.

      • ji17br@lemmy.ml · 18 upvotes · 8 months ago

        Mbps = Mb/s = Megabits per second.

        MBps = MB/s = Megabytes per second.

        The p is just the /. It’s the capital or lowercase B that makes the difference.
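
        A minimal Python sketch of that conversion (the 100 Mbps / 12.5 MB/s pair is just the example figure used elsewhere in this thread):

        ```python
        # Mbps (megabits per second) and MB/s (megabytes per second)
        # differ only by the 8 bits in a byte.

        def mbps_to_mb_per_s(megabits_per_second: float) -> float:
            return megabits_per_second / 8

        def mb_per_s_to_mbps(megabytes_per_second: float) -> float:
            return megabytes_per_second * 8

        print(mbps_to_mb_per_s(100))   # 100 Mbps -> 12.5 MB/s
        print(mb_per_s_to_mbps(12.5))  # 12.5 MB/s -> 100 Mbps
        ```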

          • ji17br@lemmy.ml · 2 upvotes · 8 months ago

            As a computer engineer, I had better know. And don’t get me started on MiB vs MB

              • ji17br@lemmy.ml · 4 upvotes · 8 months ago

                kB = kilobytes = 1000 bytes

                MB = megabytes = 1000 kB

                KiB = kibibytes = 1024 bytes

                MiB = mebibytes = 1024 KiB

                Generally, your OS reports hard drive/SSD capacity in GiB (gibibytes) while the manufacturer advertises it in GB. This is the reason a 1 terabyte drive shows up as something like 931 GB in your system: your system is counting in GiB (even if it labels them GB), while the manufacturer uses true GB.

                1GB = 1,000,000,000 bytes

                1GiB = 1,073,741,824 bytes

                1 GB =~ 0.931 GiB

                Edit: I had it backwards, it is fixed now
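
                A minimal Python sketch reproducing that arithmetic (assuming a drive sold as 1 TB, i.e. 10^12 bytes):

                ```python
                GB = 10**9    # decimal gigabyte, used by manufacturers
                GiB = 2**30   # binary gibibyte, used by most operating systems

                drive_bytes = 10**12      # a "1 TB" drive as sold
                print(drive_bytes / GiB)  # ~931.3, what the OS reports
                print(GB / GiB)           # ~0.931, i.e. 1 GB is about 0.931 GiB
                ```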

      • Dran@lemmy.world · 8 upvotes · 8 months ago
        • 3.125 MB/s to 12.5 MB/s

        He is right about megabits vs. megabytes, though. Internet speed is advertised in bits per second, while file sizes and transfer speeds are usually shown in software as megabytes per second.

  • TechNerdWizard42@lemmy.world · 28 upvotes · 8 months ago

    100 Mbps symmetric should be the minimum standard. 100 Mbps down with 10 Mbps up is worse than what remote islands with mud huts get. Seriously, I was on a Pacific island that looked like the site of a post-hurricane photo op, and they had direct access to the fiber cables. So: gigabit symmetric internet, with ONTs glued to the side of huts, for a few bucks a month.

  • Coskii · 24 upvotes · 8 months ago

    Cool, now make them use bytes as the system of measurement and we’ll be on to something.

    • Blackmist@feddit.uk · 11 upvotes · 8 months ago

      I fear that will only happen when storage manufacturers are forced to use 1024 bytes per KB like everyone else.

      In fairness, it’s a very long-standing tradition that serial transfer devices measure speed in bits per second rather than bytes. Bytes used to vary in size, although we settled on eight bits a long time ago.

      • pafu@feddit.de · 6 upvotes · 8 months ago

        "1024 bytes per KB"

        Technically, it’s 1000 bytes per KB and 1024 bytes per KiB. Hard drive manufacturers are simply using a different unit.

      • AProfessional@lemmy.world · 4 upvotes · 8 months ago

        Base 10 is correct and easier for humans to understand. Everyone uses it except Windows and old tools: macOS, Android (AOSP), etc. all report sizes in base 10.

        • Blackmist@feddit.uk · 13 upvotes · 8 months ago

          Found the hard drive manufacturer.

          It’s 1024. It’s always been 1024. It’ll always be 1024.

          Unless of course we should start using 17.2 GB RAM sticks.

          • QuaternionsRock@lemmy.world · 6 upvotes · 8 months ago

            There’s a conflict between the linguistic and practical implications here.

            “kilo-“ means 1,000 everywhere. 1,000 is literally the definition of “kilo-“. In theory, it’s a good thing we created “kibi-“ to mean 2^10 (1024).

            Why does everyone expect a kilobyte to be 1024 bytes, then? Because “kibi-“ didn’t exist yet, and some dumb fucking IBM(?) engineers decided that 1,000 was close enough to 1,024 and called it a day. That legacy carries over to today, where most people expect “kilo-“ to mean 1024 within the context of computing.

            Since product terminology should generally match what the end-user expects it to mean, perhaps we should redefine “kilobyte” to mean 1024 bytes. That runs into another problem, though: if we change it now, when you look at a 512GB SSD, you’ll have to ask, “512 old gigabytes or 512 new gigabytes?”, arguably creating even more of a mess than we already have. That problem is why “kibi-“ was invented in the first place.

            • Semi-Hemi-Lemmygod@lemmy.world · 1 upvote · 8 months ago

              It’s not just the difference between kilo- and kibi-. It’s also the difference between bits and bytes. A kilobit is only 125 eight-bit bytes, whereas a kilobyte is 8,000 bits.

        • Blaster M@lemmy.world · 8 upvotes · 8 months ago

          Computers run on binary, base 2. 1000 vs 1024: one is a power of two (2^10), the other is not.

          • AProfessional@lemmy.world · 3 upvotes · 8 months ago

            That’s an irrelevant technical detail for modern storage. We regularly deal with billions or trillions of bytes. The world has mostly standardized on base 10 for large numbers because it’s easy to understand and convert.

            Literally all of the devices I own use this.

    • ripcord@lemmy.world · 23 upvotes · 8 months ago

      I have symmetric 1 Gbps and do a LOT of data transfer (compared to 99.99% of people). Even then, I would rarely need or even notice more than 100 Mbps.

      For most people, in the real world, why is 100 Mbps “very slow”?

        • rbesfe@lemmy.ca · 11 upvotes · 8 months ago

          The vast majority of people are not downloading multi-GB files frequently.

          • hperrin@lemmy.world · 5 upvotes · 8 months ago

            This isn’t really true. An HD movie on Netflix/Hulu/Prime/etc. is multiple GB. It just doesn’t need to download fast, because anything faster than the bitrate of the movie won’t be perceptible.

            But there are also games on platforms like Steam, Epic, PlayStation, etc. These are often very large.

            • frezik@midwest.social · 4 upvotes · 8 months ago

              For context, a 4K Blu-ray disc has a maximum transfer rate of 144 Mbps. Most streaming services are compressing much more heavily than that. Closer to 20 or 40 Mbps, depending on the service. They tend to be limited by managing datacenter bandwidth, not end user connections.

              While I get that people hate having to download big games over 100Mbps, it’s something you do once and then play the game for weeks.

              • hperrin@lemmy.world · 2 upvotes · 8 months ago

                So build the capability and people will use it when they need it. My point still stands that 100Mbps is slow, even if most people are fine with it day to day.

                Also, for a family of four, that would mean only 2 of them could watch a 40 Mbps HD stream at once. I get that it’s relatively rare for 3 people in a family to want to stream at that quality at the same time, but I wouldn’t call a connection fast if it can’t support even that.

                (YouTube recommends a bitrate of 68Mbps for 4K 60fps content and 45Mbps for 4K 30fps. Higher when using HDR.)
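
                A minimal Python sketch of that head count (the 40 Mbps stream and 100 Mbps link are just the figures quoted above):

                ```python
                # How many streams of a given bitrate fit on a link, ignoring overhead.
                def max_streams(link_mbps: float, stream_mbps: float) -> int:
                    return int(link_mbps // stream_mbps)

                print(max_streams(100, 40))   # 2: only two 40 Mbps streams fit
                print(max_streams(1000, 40))  # 25 on a gigabit link
                ```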

                • frezik@midwest.social · 3 upvotes · 8 months ago

                  Where I’m going with this is that there are much more important things than going significantly over 100Mbps. Quality of service, latency, jumbo MTU sizes, and IPv6 will affect you in many more practical ways. The bandwidth number tends to be used as a proxy (consciously or not) for overall quality issues, but it’s not a very good one. That’s how we’ll end up with 1Gbps connections that will lag worse than a 10Mbps connection from 2003.

            • Blackmist@feddit.uk · 1 upvote · 8 months ago

              Just updates running in the background use an enormous amount, let alone full game downloads.

              Twitch and YouTube use a decent amount per hour as well.

          • VieuxQueb@lemmy.ca · 4 upvotes · 8 months ago

            I used to think that until I spent a bit of time with a gamer. 75-gig updates, etc… what the fuck is in those games! The whole Netflix library?

            So games, 4K videos, etc…

        • vithigar@lemmy.ca · 2 upvotes · 8 months ago

          A file large enough to take hours, plural, at 100Mbps is more than 90GB. Doing that regularly is definitely not normal usage.
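
          A minimal Python sketch of where that 90 GB figure comes from (assuming ideal sustained throughput and decimal GB):

          ```python
          # Data moved in a given time at a given line rate, ignoring overhead.
          def gb_transferred(link_mbps: float, hours: float) -> float:
              bits = link_mbps * 1_000_000 * hours * 3600
              return bits / 8 / 1e9

          print(gb_transferred(100, 2))  # 90.0 GB in two hours at 100 Mbps
          ```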

          • michael_palmer@lemmy.sdf.org · 1 upvote · 8 months ago

            The average 4K BDRip movie is 60 GB, and the average AAA game is 60-100+ GB. So you’re saying that watching a movie once a week and downloading one game is not normal? Using 1 Gbit internet means saving 3-6 hours per week.

            • vithigar@lemmy.ca · 6 upvotes · 8 months ago

              Watching movies and playing AAA games is normal, sure.

              Downloading 4K BDRIPs and a new AAA game every week definitely isn’t. Most people probably stream their movies, and even those prone to pirating their content are likely downloading re-encoded copies, not full sized BDRIPs.

              On top of that, it’s not like you have to sit there and wait for it. You’re only really saving that time if it’s time you were going to spend sitting and staring at your download progress instead of doing something else.

              I’m not saying edge cases don’t exist where someone would notice a real difference in having >100Mbps, but it’s just that, an edge case.

              • michael_palmer@lemmy.sdf.org · 1 upvote · 8 months ago

                Most of the time, the idea to watch a film comes to me quite suddenly, so I have to wait until the film is at least partially downloaded before I can start watching it. Even downloading an app from the repository takes a tenth of the time. And 1000 Mbps internet is only 5-10 euros more expensive than 100 Mbps.

    • morbidcactus@lemmy.ca · 4 upvotes · 8 months ago

      It’s amazing how much our views change with time. My dad was definitely a super early adopter of cable when it became available in our area; if I recall, it was 16 Mbps, which was unreal to me in 2002. I made do with 5 Mbps in uni and it was totally usable.

      But now I’ve had 1 Gbps for years, and wow, it’s so different; it changes your habits too. I don’t hoard installed games as much. I can pull them down in minutes, so why keep something installed if I’m not going to use it?

      • hperrin@lemmy.world · 3 upvotes · 8 months ago

        I remember thinking, “How am I ever going to fill this 100MB hard drive? That’s so much space!” That was some time around 1997, I think.

  • nowwhatnapster@lemmy.world · 19 upvotes · 8 months ago

    Altice (Optimum) took this opportunity to cut upload speeds from 35 Mbps to 20 under the guise of a “free upgrade”. You want your old upload speed back? Oh, that’s their most expensive tier now.

    • Avg@lemm.ee · 5 upvotes · 8 months ago

      I’m dropping them; it was too unreliable for working from home. I pay twice as much now for Fios.

    • theparadox@lemmy.world · 2 upvotes · 8 months ago

      Same for my “XFinity” (Comcast) service. Literally the only plan with more than 20 up is the most expensive tier with 1200/35. Sadly, it has been that way for several years… but this year they had no choice but to jack up all rates across the board so the most expensive tier is now $30 more expensive ($90 -> $120). No other competition so… that’s that.

  • FuryMaker@lemmy.world · 18 upvotes · 8 months ago

    I care more for stability and low latency, not so much speed.

    Offering me a faster cellular or satellite connection doesn’t interest me.

    • n3m37h@lemmy.dbzer0.com · 17 upvotes · 8 months ago

      I went from a 1.5/1 Gbps fibre connection down to 20/10 Mbps when I moved. There is a MASSIVE difference. Rural internet is dog shit and no one cares.

      • ctkatz@lemmy.ml · 13 upvotes · 8 months ago

        I honestly believe that is because rural areas are almost always represented by Republicans, voted in by majority-Republican voters, both groups of which are extremely disinclined to make the entirety of human knowledge easily and quickly accessible, because then people might see how much better things are in other countries and start asking their federal representatives questions.

        • gamermanh@lemmy.dbzer0.com · 2 upvotes · 8 months ago

          They also fall prey to the classic “only one internet provider” shit because of the whole “whoever pays to put the lines in owns those lines forevermore” shit we have here.

          It cost Comcast 10k to run a new line half a block to a place I lived 6 years ago, and that was in a rather empty part of my town.

          Imagine how much it costs to run lines M I L E S to rural people’s homes. Who’s even going to try setting up there when someone else has already done it?

          My area is controlled by Dems that are pretty lib, but thanks to how expensive it is to start an ISP, we have literally 1 option for non-satellite internet across an almost 75-square-mile area. Their max speed is 100 Mbps sync, you have to fill out a PDF to get service (including putting a password for your account on said PDF; I put “fuck No im not” for mine, for obvious reasons), and their techs will ignore service requests (they installed their stupid rental router and charged me monthly for it despite me saying not to) and lie (they said they couldn’t add my own router to their list multiple times before someone finally took its fucking MAC address from me).

    • SuiXi3D@fedia.io · 4 upvotes · 8 months ago

      Your download speed being fast or slow doesn’t mean the servers hosting the data you’re accessing, or any of the hops between you and those servers, are going to feed you data at that speed.

      • michael_palmer@lemmy.sdf.org · 2 upvotes · 8 months ago

        Game stores like Steam and GOG can provide download speeds of 1 Gbps or more, and torrents have no inherent speed limit; it depends on the number of seeders.

        • hperrin@lemmy.world · 1 upvote · 8 months ago

          Yeah, I regularly hit about 80MBps (640Mbps) from Steam. I’m pretty close to their San Diego servers, so I get the good pipes. If I was closer, I’d probably be able to hit gigabit speeds.

        • SuiXi3D@fedia.io · 1 upvote · 8 months ago

          Yes, but normal websites might not. There’s no reason to if the amount of data being transferred is so small. Even large transfers, particularly from streaming video providers, can have trouble feeding data to you at 1 Gbps simply because the network interface on the server might be saturated, a switch along the way might be congested, or your traffic might be sitting in a queue somewhere. There are SO MANY hops between you and whatever data you’re trying to access, and every one of them influences the speed at which data gets to you. I’m not saying gigabit speeds aren’t worth paying for, but not everyone needs those speeds, especially if their ISP’s hardware isn’t up to snuff.

    • Takumidesh@lemmy.world · 3 upvotes · 8 months ago

      Your connection would not allow streaming even one Blu-ray quality video stream, and good luck doing anything else on the connection while that is happening.

      If your work sent you a 10 GB file and you needed to send it back, it would take you about 3 hours to do that (with a functionally useless connection otherwise while downloading and uploading the file).

      Downloading a popular game like Baldur’s Gate 3 would take just under 9 hours.

      Downloading it twice (to play with your spouse or kids), plus updates, and then watching Netflix (which will cut into your download speed) while you wait for it to download would eat up a weekend.

      Never mind the fact that slow internet literally wastes away your life as you spend more and more micro-moments just staring at blank or partially loaded websites.
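
      A minimal Python sketch of that round-trip estimate (assuming the 20/10 Mbps connection mentioned above, ideal throughput, and decimal GB):

      ```python
      # Hours to move size_gb over a link of link_mbps, ignoring overhead.
      def transfer_hours(size_gb: float, link_mbps: float) -> float:
          return size_gb * 8000 / link_mbps / 3600

      down = transfer_hours(10, 20)  # ~1.1 h to download a 10 GB file
      up = transfer_hours(10, 10)    # ~2.2 h to send it back up
      print(round(down + up, 1))     # ~3.3 h round trip
      ```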

  • CharlesDarwin@lemmy.world · 7 upvotes · 8 months ago

    I’d like to see a big government push to provide municipal broadband service in every single metro area and extend it by whatever means into rural communities.

    Xfinity keeps raising rates; I’m paying more now for internet alone than basic cable + internet + digital voice cost back in the 00s. While it’s around 800 down, it’s still only about 40-something up, and it has been like that for years and years.

    I think we desperately need competition and if the government were to provide it, that’d be just fine.

    • hperrin@lemmy.world · 3 upvotes · 8 months ago

      Where are you? I’ve lived in California my whole life and have had faster speeds than that since 1998.

      • PatFusty@lemm.ee · 2 upvotes · 8 months ago

        I was kidding. I get 900+ Mbps on my phone while I only get about 400 max on my desktop at home. I live north of San Diego

    • SomeOne · 3 upvotes · 8 months ago

      It really does suck. Where I live, the base plan gives you 300 Mbps down (which I know is pretty fast), but you are limited to 10 Mbps up. As much as they tout their speeds, you’ll only get them if you pay top dollar.

      • MrMcGasion@lemmy.world · 2 upvotes · 8 months ago

        Sounds like Spectrum where I live. On the bright side, our 300 down is usually closer to 350, but their 10 up is usually closer to 8. Meanwhile, you have to dig to find the upload speed when you sign up, even though they have the download speeds plastered everywhere. Honestly, there should probably be a rule that ISPs can’t list download speeds without the upload speeds right next to them.

        • SomeOne · 2 upvotes · 8 months ago

          Yeah, it is Spectrum. The company is quite irritating, and yeah, they should be required to show both up and down speeds next to each other. For a while I had T-Mobile internet, but the speeds were too inconsistent, so back to Spectrum it was.