I have never liked Apple and lately even less. F… US monopolies

  • مهما طال الليل@lemm.ee · 21 hours ago

    My iPhone 13 mini will be my last iPhone. They lost me when it turned out they donate to illegal settlements in the West Bank. This is just more fuel to the fire.

  • Mwa@lemm.ee · 1 day ago

    I love how Apple advertises “Privacy by default” but they do this

    • werefreeatlast@lemmy.world · 2 days ago

      Nah, they’re just checking to see if you guys are well proportioned or maybe you need something they can sell you. For example, new clothes? Dave, isn’t that the same jacket you always wear to work? You need a new one! Here are some options from Walmart, we’ll just hide them here behind this thing you’re browsing about right now… How about here too! And here!

  • kipo@lemm.ee · 2 days ago

    An opt-out that you can’t opt out of because Apple already opted you in and took your photos?

    This seems like it is going to be a huge lawsuit. Since a class action won’t deter them or help us, let’s all sue Apple individually in small claims court and kill them with death by a billion cuts.

    • boonhet@lemm.ee · 1 day ago

      According to another comment, the photos never leave your device, that part of the processing is done on-device. The global index is on Apple servers.

    • ℍ𝕂-𝟞𝟝@sopuli.xyz · 2 days ago

      Not saying that it shouldn’t be illegal and it’s shady as fuck, but GDPR opt-outs are usually retroactive, meaning you can remove consent from data they’ve already processed, and they have to retroactively scrub your personal data out.

      • ZoopZeZoop@lemmy.world · 2 days ago

        How do they retroactively remove the knowledge the AI has gained from analyzing all of our personal information?

        • ℍ𝕂-𝟞𝟝@sopuli.xyz · 2 days ago

          They don’t need to.

          They only have to remove your personal data. So the company / AI model is not allowed to have data specifically on you, but it can have the average age of people living in your town even if your data contributed to calculating that average.

          That said, Apple here never had affirmative consent, so they can’t get away with just doing this.
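          The aggregate-vs-personal-data distinction above can be sketched in a few lines of Python (toy numbers and hypothetical names, purely for illustration):

```python
# Toy sketch (hypothetical names and ages): a derived aggregate can be
# retained even after the personal records behind it must be erased.
records = {"alice": 34, "bob": 51, "carol": 29}   # personal data

average_age = sum(records.values()) / len(records)  # anonymous aggregate

# GDPR erasure request: the personal record itself is scrubbed...
del records["alice"]

# ...but the previously computed average, which identifies no one,
# can legitimately be kept.
assert "alice" not in records
```

          The average still reflects the erased contribution, which is exactly the point: scrubbing someone’s personal data doesn’t require unwinding every statistic derived from it.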

      • kipo@lemm.ee · 2 days ago

        If they did this in Europe, I would argue it is a GDPR violation and it would be impossible for Apple to remove the data they collected. I hope the EU fines Apple out the nose for this.

  • VeganCheesecake · 2 days ago

    I don’t even get it. Like, make a pop up with a short blurb explaining the feature. Most users will probably opt in, and you don’t piss off the ones that don’t want this.

  • dubyakay@lemmy.ml · 2 days ago

    Enhanced Visual Search in Photos allows you to search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides [your] IP address. This prevents Apple from learning about the information in your photos. You can turn off Enhanced Visual Search at any time on your iOS or iPadOS device by going to Settings > Apps > Photos. On Mac, open Photos and go to Settings > General.

    Apple did explain the technology in a technical paper published on October 24, 2024, around the time that Enhanced Visual Search is believed to have debuted. A local machine-learning model analyzes photos to look for a “region of interest” that may depict a landmark. If the AI model finds a likely match, it calculates a vector embedding – an array of numbers – representing that portion of the image.

    So it’s local. And encrypted. How is this really news? Am I missing something?
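    For what it’s worth, the “vector embedding” match described in the quoted paper can be illustrated with a toy nearest-neighbour lookup (made-up vectors and landmark names; real embeddings have hundreds of dimensions):

```python
import math

# Illustrative only: a "vector embedding" is just an array of numbers,
# and matching means finding the index entry closest to it.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# A tiny stand-in for the "global index" of landmark embeddings.
index = {
    "Eiffel Tower": [0.9, 0.1, 0.3],
    "Golden Gate Bridge": [0.2, 0.8, 0.5],
}

query = [0.88, 0.15, 0.28]  # embedding of the photo's region of interest
best = max(index, key=lambda name: cosine_similarity(query, index[name]))
print(best)  # the closest landmark in the index
```

    The privacy-relevant part is that, per the paper, this comparison happens on homomorphically encrypted embeddings server-side, not on plaintext vectors like these.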

  • dustyData@lemmy.world · 3 days ago

    Where’s the “Apple is the only tech giant that respects your privacy” crowd? Just because your data isn’t being publicly auctioned doesn’t mean they aren’t harvesting it and infringing on your privacy.

    • Ulrich@feddit.org · 2 days ago

      No one thinks Apple, or any other ecosystem for that matter, is completely private. It’s just far more private than Android. Primarily because Apple is not an advertising company.

      • Ulrich@feddit.org · 2 days ago

        Apple is always silent, because they know they have no justification for their bullshit.

    • deranger@sh.itjust.works · 3 days ago

      It’s not data harvesting if it works as claimed. The data is sent encrypted and not decrypted by the remote system performing the analysis.

      From the link:

      Put simply: You take a photo; your Mac or iThing locally outlines what it thinks is a landmark or place of interest in the snap; it homomorphically encrypts a representation of that portion of the image in a way that can be analyzed without being decrypted; it sends the encrypted data to a remote server to do that analysis, so that the landmark can be identified from a big database of places; and it receives the suggested location again in encrypted form that it alone can decipher.

      If it all works as claimed, and there are no side-channels or other leaks, Apple can’t see what’s in your photos, neither the image data nor the looked-up label.

      • ExtremeDullard@lemmy.sdf.org · 3 days ago

        It’s not data harvesting if it works as claimed. The data is sent encrypted and not decrypted by the remote system performing the analysis.

        What if I don’t want Apple looking at my photos in any way, shape or form?

        I don’t want Apple exfiltrating my photos.
        I don’t want Apple planting their robotic minion on my device to process my photos.
        I don’t want my OS doing stuff I didn’t tell it to do. Apple has no business analyzing any of my data.

        • utopiah@lemmy.ml · 2 days ago

          I don’t want Apple exfiltrating my photos.

          Well they don’t. I don’t want to justify the opt-in by default, but again (cf. my reply history), here they are precisely trying NOT to send anything usable to their own servers. They are sending data that can’t be used by anything but your phone. That’s the entire point of homomorphic encryption: even the server it’s sent to does NOT see it as the original data. It can only do certain kinds of computation on it, and it can’t “revert” back to the original.

          • ExtremeDullard@lemmy.sdf.org · 2 days ago

            If they don’t look at my data, they don’t even have to encrypt it.
            If they don’t try to look at my data, they don’t need to wonder whether they should ask my permission.

            I don’t want Apple or anybody else looking at my data, for any reason, is my point.

            • utopiah@lemmy.ml · 2 days ago

              I agree on permission.

              Yet I’ll still try to clarify the technical aspect, because I find it genuinely interesting and actually positive. The point of homomorphic encryption is that they are NOT looking at your data. They are not encrypting data just to decrypt it later. An analogy would be:

              • we are a dozen friends around a table,
              • we each have 5 cards hidden from the others,
              • we each photocopy 1 card in secret,
              • we shred the copied card, remove half of the shreds, put them in a cup, and write a random long number on that cup,
              • we place that cup in a covered bowl,
              • one of us, picked at random, takes a cup, counts how many red shards are in it, writes that count back in the cup, and adds the number to the total written on the bowl; we repeat that process until every cup has been written on exactly once,
              • once that’s done, we each take back our cup without showing it to the others

              Thanks to that process we know both something about our card (the number of red shards) and all other cards (total number of red shards on the bowl) without having actually revealed what our card is. We have done so without sharing our data (the uncut original card) and it’s not possible to know its content, even if somebody were to take all cups.

              So… that’s roughly how homomorphic encryption works. It’s honestly fascinating and important IMHO, the same way that cryptography and its foundations, e.g. one-way functions or computational complexity more broadly, are basically the basis for privacy online today.

              You don’t have to agree with how Apple implemented it, but I’d argue understanding how it works and when it can be used is important.

              Let me know if it makes sense, it’s the first time I tried to make an analogy for it.

              PS: if someone working on HE has a better analogy or spots incorrect parts, please do share.

              • ExtremeDullard@lemmy.sdf.org · 1 day ago

                It makes sense, but you totally miss my point. To go with your analogy, my point is:

                • I’m not interested in playing cards

                That’s it.

                I don’t care how fascinating the technology is and how clever Apple is: they are not welcome to implement it on my device. I didn’t invite them to set up a card game and I expect them not to break into my house to set up a table.

                • utopiah@lemmy.ml · 1 day ago

                  they are not welcome to implement it on my device

                  I wish; sadly, that’s not how using non-open-source or non-open-hardware devices works. You are running their software on their hardware, with their limitations. It’s not a PC or SBC.

                  Edit: if we were to stick to the card game analogy, it’d be more like playing the card game in a hotel, in a room that you rented, rather than at home.

        • ganymede@lemmy.ml · 3 days ago

          TLDR edit: I’m supporting the above comment - i.e. I do not support Apple’s actions in this case.


          It’s definitely good for people to learn a bit about homomorphic computing, and let’s give some credit to Apple for investing in this area of technology.

          That said:

          1. Encryption in the majority of cases doesn’t actually buy absolute privacy or security, it buys time - see NIST’s criteria of ≥30 years for AES. It will almost certainly be crackable one day, either by weakening or other advances… How many people are truly able to give genuine informed consent in that context?

          2. Encrypting something doesn’t always work out as planned, see example:

          “DON’T WORRY BRO, ITS TOTALLY SAFE, IT’S ENCRYPTED!!”


          Yes, Apple is surely capable enough to avoid simple, documented mistakes such as the above, but it’s also quite likely some mistake will be made. And we note, Apple is also extremely likely capable of engineering leaks and concealing them or making them appear accidental (or, even if truly accidental, leveraging them later on).

          Whether they’d take the risk, whether their (un)official internal policy would support or reject that is ofc for the realm of speculation.

          That they’d have the technical capability to do so isn’t at all unlikely. Same goes for a capable entity with access to apple infrastructure.

          3. The fact they’ve chosen to act questionably regarding users’ ability to meaningfully consent, or even consent at all(!), suggests there may be some issues with assuming good faith on their part.
          • ExtremeDullard@lemmy.sdf.org · 3 days ago

            How hard is it to grasp that I don’t want Apple doing anything in my cellphone I didn’t explicitly consent to?

            I don’t care what technology they develop, or whether they’re capable of applying it correctly: the point is, I don’t want it on my phone in the first place, any more than I want them to set up camp in my living room to take notes on what I’m doing in my house.

            My phone, my property, and Apple - or anybody else - is not welcome on my property.

            • ganymede@lemmy.ml · 3 days ago

              Sorry for my poor phrasing, perhaps re-read my post? i’m entirely supporting your argument. Perhaps your main point aligns most with my #3? It could be argued they’ve already begun from a position of probable bad faith by taking this data from users in the first place.

      • Ebby@lemmy.ssba.com · 3 days ago

        Wait, what?

        So you take a pic, it’s analysed, the analysis is encrypted, encrypted data is sent to a server that can deconstruct encrypted data to match known elements in a database, and return a result, encrypted, back to you?

        Doesn’t this sort of bypass the whole point of encryption in the first place?

        Edit: Wow! Thanks everyone for the responses. I’ve found a new rabbit hole to explore!

        • BorgDrone@lemmy.one · 3 days ago

          Doesn’t this sort of bypass the whole point of encryption in the first place?

          No, homomorphic encryption allows a 3rd party to perform operations on encrypted data without decrypting it. The resulting answer is in encrypted form and can only be decrypted by whoever has the key.

          Extremely oversimplified example:

          Say you have a service that converts dollar amounts to euros using the latest exchange rate. You send the amount in dollars, it multiplies by the exchange rate and then returns the euro amount.

          Now, let’s assume the clients of this service do not want to disclose the amounts they are converting. What they could do is pick a large random number and multiply the amount by this number. The conversion service multiplies this by the exchange rate and returns the ridiculously large number back. Then you divide that number by the random number you picked, and you have converted dollars to euros without the service ever knowing the actual amount.

          Of course the reality is much more complicated than that, but the idea is the same: you can perform operations on data in its encrypted form without knowing what the data is or what the decrypted result of the operation is.
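          That simplified exchange-rate example can be written out in a few lines of Python (multiplicative blinding, which is not real homomorphic encryption but shows the shape of the idea; the rate and amount are made up):

```python
import random

# Sketch of the comment's simplified example (multiplicative blinding):
# the server multiplies by the exchange rate without ever seeing the
# true dollar amount.
RATE = 0.96  # euros per dollar, known to the server

def server_convert(blinded_dollars):
    # The server only sees a meaningless large number.
    return blinded_dollars * RATE

amount = 1250.0                       # client's secret dollar amount
blind = random.randint(10**6, 10**9)  # client's secret random factor

blinded_result = server_convert(amount * blind)
euros = blinded_result / blind        # client unblinds the answer

print(euros)  # ≈ 1250.0 * 0.96, yet the server never saw 1250.0
```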

        • utopiah@lemmy.ml · 3 days ago

          So homomorphic encryption means the server can compute on the data without actually knowing what’s in it. It’s counter-intuitive but better not think about it as encryption/decryption/encryption precisely because the data is NOT decrypted on the server. It’s sent there, computed on, then a result is sent back.

          • kipo@lemm.ee · 2 days ago

            Wait, it’s called homomorphic encryption? All we’d have to do is tell MAGAs that Tim Apple just started using homomorphic encryption with all the iphones and the homophobic backlash would cause Apple to walk this back within a week.

            I’m only half joking.

          • someacnt@sh.itjust.works · 3 days ago

            It might still be possible to compare ciphertexts and extract information from there, right? Welp I am not sure if the whole scheme is secure against related attacks.

            • utopiah@lemmy.ml · 3 days ago

              extract information

              I don’t think so, at least assuming the scheme isn’t actually broken… but then arguably that would also have far reaching consequence for encryption more broadly, depending on what scheme the implementation would be relying on.

              The whole point is precisely that one can compute without “leaks”.

              Edit: they are relying on Brakerski-Fan-Vercauteren (BFV) HE scheme, cf https://machinelearning.apple.com/research/homomorphic-encryption

        • deranger@sh.itjust.works · 3 days ago

          I’m not pretending to understand how homomorphic encryption works or how it fits into this system, but here’s something from the article.

          With some server optimization metadata and the help of Apple’s private nearest neighbor search (PNNS), the relevant Apple server shard receives a homomorphically-encrypted embedding from the device, and performs the aforementioned encrypted computations on that data to find a landmark match from a database and return the result to the client device without providing identifying information to Apple nor its OHTTP partner Cloudflare.

          There’s a more technical write up here. It appears the final match is happening on device, not on the server.

          The client decrypts the reply to its PNNS query, which may contain multiple candidate landmarks. A specialized, lightweight on-device reranking model then predicts the best candidate by using high-level multimodal feature descriptors, including visual similarity scores; locally stored geo-signals; popularity; and index coverage of landmarks (to debias candidate overweighting). When the model has identified the match, the photo’s local metadata is updated with the landmark label, and the user can easily find the photo when searching their device for the landmark’s name.
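          A toy version of that on-device reranking step might look like this (all field names, weights, and scores here are invented for illustration; the real model is learned, not hand-written):

```python
# Hypothetical sketch of the quoted flow: the server's encrypted reply
# contains several candidate landmarks, and a lightweight on-device
# step picks the best one by combining local signals.
candidates = [
    {"label": "Tower Bridge",  "visual_score": 0.81, "distance_km": 5400.0, "popularity": 0.9},
    {"label": "London Bridge", "visual_score": 0.79, "distance_km": 5399.0, "popularity": 0.6},
    {"label": "Ponte Vecchio", "visual_score": 0.40, "distance_km": 1100.0, "popularity": 0.7},
]

def rerank_score(c):
    # Weighted mix of visual similarity, a geo prior (closer is
    # likelier), and popularity; a real model would learn these weights.
    geo_prior = 1.0 / (1.0 + c["distance_km"] / 1000.0)
    return 0.7 * c["visual_score"] + 0.2 * geo_prior + 0.1 * c["popularity"]

best = max(candidates, key=rerank_score)
print(best["label"])  # winning candidate; its label goes into local photo metadata
```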

          • 31337@sh.itjust.works · 3 days ago

            That’s really cool (not the auto opt-in thing). If I understand correctly, that system offers pretty strong theoretical privacy guarantees (assuming their closed-source client software works as they say, with sending fake queries and all that for differential privacy). If the backend doesn’t work like they say, they could infer which landmark is in an image when finding the approximate minimum distance to embeddings in their DB, but with the fake queries they can’t be sure which query is real. They can’t see the actual image either way, as long as the “128-bit post-quantum” encryption algorithm doesn’t have any vulnerabilities (and the closed-source software works as described).
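            The fake-query idea can be sketched quickly; here each query string stands in for an encrypted embedding (everything is illustrative, not Apple’s actual protocol):

```python
import random

# Sketch of the dummy-query idea: the client hides its real (encrypted)
# lookup among fakes, so the server cannot tell which query corresponds
# to an actual photo.
def make_query_batch(real_query, num_dummies=3):
    batch = [real_query] + [f"dummy-{random.random()}" for _ in range(num_dummies)]
    random.shuffle(batch)  # server sees the batch in random order
    return batch

real = "encrypted-embedding-of-photo"
batch = make_query_batch(real)

# The real query is present, but its position reveals nothing.
assert real in batch
print(len(batch))  # 4 queries sent; only the client knows which is real
```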

          • rtxn@lemmy.world · 3 days ago

            by using high-level multimodal feature descriptors, including visual similarity scores; locally stored geo-signals; popularity; and index coverage of landmarks (to debias candidate overweighting)

            …and other sciencey-sounding technobabble that would make Geordi LaForge blush. Better reverse the polarity before the dilithium crystals fall out of alignment!

              • rtxn@lemmy.world · 3 days ago

                That’s the point. It’s a list of words that may or may not mean something and I can’t make an assessment on whether or not it’s bullshit. It’s coming from Apple, though, and it’s about privacy, which is not good for credibility.

                • datavoid@lemmy.ml · 3 days ago

                  I don’t know what a geo-signal is, but everything else listed there makes perfect sense given the context.

    • NauticalNoodle@lemmy.ml · 3 days ago

      I heard that they were the first test audience Apple used to test their new product, the IRope. Apple designed it to go around their users’ necks. The other end of the IRope is designed to attach to a proprietary cryptographic dongle, called the Lynch-Key, to work. Apple says it’s like a lynchpin because it’s critical to the function of the IRope.

      Apple never did hear back from the test-audience. -I think this product will be a real winner!

    • TrickDacy@lemmy.world · 3 days ago

      Oh they’re here, just seething about this and their precious green texts or whatever the fuck else false sense of security they’ve been clinging to

  • Whats_your_reasoning@lemmy.world · 3 days ago

    In case anyone came to the comments looking for directions on how to opt out:

    1. Go to Settings.
    2. Scroll down and select “Photos.”
    3. Locate the “Enhanced Visual Search” option and turn off the toggle.
    • UnderpantsWeevil@lemmy.world · 2 days ago

      Tim Apple was going to kick $1M to whoever won. For a guy with a net worth in the tens of billions, this is just a tip to the wait staff at the Table Of Success.

      But the Apple photo library is a huge potential source of revenue. It’s worth significantly more than $1M. This is, incidentally, why you don’t need to pay Apple to host those images. If you’re not the client, you’re the product.

      • boonhet@lemm.ee · 1 day ago

        This is, incidentally, why you don’t need to pay Apple to host those images

        Huh? You pay for anything above 5 GB or so. It’s standard for most cloud providers to offer a free tier to get you hooked. Their storage beyond that isn’t even all that cheap.

  • Grandwolf319@sh.itjust.works · 2 days ago

    Although this is terrible, once again a headline on lemmy made me paranoid only to find out that my phone probably doesn’t even support this.

    Going through the settings and turning things off is second nature to me by now; it’s not unique to Apple (looking at you, Microsoft).

    What we need is an opt out mode on every device. Similar to the accept necessary cookies only, we need every device to let you fully opt out from everything it can when you boot it up for the first time.

    • Cataphract@lemmy.ml · 2 days ago

      I don’t think it’s fair to say “once again a headline on lemmy”; by connotation you’re vaguely suggesting Lemmy is responsible.

      I’m a big settings person as well, but honestly Apple is a fucking evil genius at hiding options in menus within menus. Plus this was an opt-in change made quietly, in the middle of “nobody knows”; I don’t check all of my settings and their subsequent menus daily for any changes being made.

      I’m just flabbergasted by the whole apple industry though. Like it’s obvious when a company wants to offer a new user experience (their newest innovative design!), and it’s obvious when a company wants to only tailor to “Their preferred vision of what an apple user and their experience should be”. No one asked for this shit, and it’s being shoved down everyone’s throats.

      I guess I shouldn’t be surprised. I still watch people walk into a Dollar General knowing how crappy that company acts and how much more costly everything is. We’re all slowly being pigeonholed into a “unified user experience” and it’s the shittiest outcome.

  • jqubed@lemmy.world · 3 days ago

    “Apple is being thoughtful about doing this in a (theoretically) privacy-preserving way, but I don’t think the company is living up to its ideals here,” observed software developer Michael Tsai in an analysis shared Wednesday. “Not only is it not opt-in, but you can’t effectively opt out if it starts uploading metadata about your photos before you even use the search feature. It does this even if you’ve already opted out of uploading your photos to iCloud.”

    Reading the article, the service itself is interesting and it sounds like Apple might have found a way to process the data while preserving user privacy, but the fact that they unilaterally opted everyone in without giving them a choice is the biggest problem.

    • fmstrat@lemmy.nowsci.com · 3 days ago

      Agreed, I’m not an Apple fan, but the headline is vague enough to make things seem worse than they are.