• WiildFiire@lemmy.world · 1 year ago

      It’ll be kept within product marketing and, I dunno how, but it would absolutely be used to see what they can raise prices on

    • CeeBee@lemmy.world · 1 year ago

      It’s getting there. In the next few years as hardware gets better and models get more efficient we’ll be able to run these systems entirely locally.

      I’m already doing it, but I have some higher end hardware.

        • CeeBee@lemmy.world · 1 year ago

          Stable Diffusion SDXL Turbo model running in Automatic1111 for image generation.

          Ollama with Ollama-webui for an LLM. I like the Solar:7b model. It’s lightweight, fast, and gives really good results.

          I have some beefy hardware that I run it on, but it’s not necessary to have.

        • Ookami38@sh.itjust.works · 1 year ago

          Depends on what AI you’re looking for. I don’t know of an LLM (a language model, think ChatGPT) that works decently on personal hardware, but I also haven’t really looked. For art generation though, look up Automatic1111 installation instructions for Stable Diffusion. If you have a decent GPU (I was running it on a 1060, slowly, til I upgraded) it’s a simple enough process to get started, there’s tons of info online about it, and it’s all run on local hardware.

          • CeeBee@lemmy.world · 1 year ago

            I don’t know of an LLM that works decently on personal hardware

            Ollama with ollama-webui. Models like solar-10.7b and mistral-7b work nicely on local hardware. Solar 10.7b should work well on a card with 8 GB of VRAM.

      • anti-idpol action@programming.dev · 1 year ago

        Yeah, if you’re willing to carry a brick, or at least a power bank (brick), if you don’t want it to constantly overheat or deal with 2-3 hours of battery life. There’s only so much copper can take, and there are limits to miniaturization.

        • arthurpizza@lemmy.world · 1 year ago

          It’s not like that though. Newer phones are going to have dedicated hardware for processing neural networks, LLMs, and other generative tools. The dedicated hardware will make these processes just barely sip the battery life.

          • MenacingPerson@lemm.ee · 1 year ago

            wrong.

            if that existed, all those AI server farms wouldn’t be so necessary, would they?

            dedicated hardware for that already exists, it definitely isn’t gonna be able to fit a sizeable model on a phone any time soon. models themselves require multiple tens of gigabytes of storage space. you won’t be able to fit more than a handful on even a 512gb internal storage. the phones can’t hit the ram required for these models at all. and the dedicated hardware still requires a lot more power than a tiny phone battery.

            • arthurpizza@lemmy.world · 1 year ago

              Those server farms exist because the needs of corporations are just different from the needs of regular users.

              I’m running an 8 GB LLM model locally on my PC that performs better than 16 GB models from just a few months ago.

              It’s almost as if technology can get better and more efficient over time.

    • aubertlone@lemmy.world · 1 year ago

      Hey me too.

      And I do have a couple different LLMs installed on my rig. But having that resource running locally is years and years away from being remotely performant.

      On the bright side there are many open source llms, and it seems like there’s more everyday.

  • MiDaBa@lemmy.ml · 1 year ago

    The bad news is the AI they’ll pay for will instead estimate your net worth and the highest price you’re likely to pay. They’ll then dynamically change the price of things like groceries to make sure the price they’re charging will maximize their profits on any given day. That’s the AI you’re going to get.

  • theneverfox@pawb.social · 1 year ago

    AI could do this. Conventional programming could do it faster and better, even if it was written by AI.

    It’s an important concept to grasp

    • theblueredditrefugee@lemmy.dbzer0.com · 1 year ago

      Cameras in your fridge and pantry to keep tabs on what you have, computer vision to take inventory, clustering to figure out which goods can be interchanged with which, language modeling applied to a web crawler to identify the best deals, and then some conventional code to aggregate the results into a shopping list

      Unless you’re assuming that you’re gonna be supplied APIs to all the grocery stores which have an incentive to prevent this sort of thing from happening, and also assuming that the end user is willing, able, and reliable enough to scan every barcode of everything they buy

      This app basically depends on all the best ai we already have except for image generation

      • lightnsfw@reddthat.com · 1 year ago

        Cameras and computer vision aren’t necessary. Food products already come with upcs. All you need is a barcode reader to input stuff and to track what you use in meals. Tracking what you use could also be used for meal planning.
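        A barcode-driven tracker really can be that simple. A minimal sketch of the idea (the UPC and item names here are made up):

```python
# Minimal sketch of a UPC-based pantry tracker (all item data is hypothetical).
class Pantry:
    def __init__(self):
        self.items = {}  # upc -> {"name": str, "qty": int}

    def scan_in(self, upc, name, qty=1):
        """Record a purchase by scanning its barcode."""
        entry = self.items.setdefault(upc, {"name": name, "qty": 0})
        entry["qty"] += qty

    def use(self, upc, qty=1):
        """Record consumption; items that hit zero go on the shopping list."""
        if upc in self.items:
            self.items[upc]["qty"] = max(0, self.items[upc]["qty"] - qty)

    def shopping_list(self):
        return [e["name"] for e in self.items.values() if e["qty"] == 0]

pantry = Pantry()
pantry.scan_in("012345678905", "butter", qty=2)
pantry.use("012345678905")
pantry.use("012345678905")
print(pantry.shopping_list())  # -> ['butter']
```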

        • theblueredditrefugee@lemmy.dbzer0.com · 1 year ago

          Yeah, I did think of the barcode approach, but I didn’t think anyone would be willing to scan every item, which is why I ignored it

          However, revisiting this question made me realize that we could probably have the user scan receipts. It would take some doing but you could probably extract all the information from the receipt because it’s in a fairly predictable format, and it’s far less onerous.

          OTOH, you still have to scan barcodes every time you cook with something, and you’d probably want some sort of mechanism to track partial consumption and leftovers, though a minimum viable product could work without that

          The tough part, then, is scouring the internet for deals. Should be doable though.

          Might try to slap something together tonight or tomorrow for that first bit, seems pretty easy, I bet you’ve got open source libraries for handling barcodes, and scanning receipts can probably just be done with existing OCR tech, error correction using minimum edit distance, and a few if statements to figure out which is the quantity and which is the item. That is, if my adhd doesn’t cause me to forget
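          The receipt-matching part (OCR output, fuzzy matching, a few if statements for quantity) can be sketched with the stdlib's difflib standing in for a dedicated edit-distance library; the catalog and receipt lines below are invented:

```python
import difflib

# Hypothetical product catalog; in practice this would be built from scanned UPCs.
CATALOG = ["shredded mini wheats", "whole milk", "butter", "hot pockets"]

def match_receipt_line(ocr_text, catalog=CATALOG, cutoff=0.6):
    """Map a noisy OCR'd receipt line to the closest known item, if any."""
    hits = difflib.get_close_matches(ocr_text.lower(), catalog, n=1, cutoff=cutoff)
    return hits[0] if hits else None

def parse_line(line):
    """Very naive split: a trailing token that parses as a number is the quantity."""
    tokens = line.split()
    qty = 1
    if tokens and tokens[-1].isdigit():
        qty = int(tokens.pop())
    return match_receipt_line(" ".join(tokens)), qty

print(parse_line("SHRD MINI WHEATS 2"))  # -> ('shredded mini wheats', 2)
```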

          • lightnsfw@reddthat.com · 1 year ago

            OTOH, you still have to scan barcodes every time you cook with something, and you’d probably want some sort of mechanism to track partial consumption and leftovers, though a minimum viable product could work without that

            If you can also keep recipes in the system you could skip scanning the barcodes here. You’d just need to input how many servings you prepared and any waste. Even if the “recipe” is just “hot pocket” or something. If the system knows how much is in a package it can deduct what you use from the total and add it to the list when you need more.
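            That recipe-deduction idea could look something like this sketch (ingredient names, amounts, and the waste factor are all hypothetical):

```python
# Sketch: recipes deduct ingredients per serving from a pantry inventory.
# All names and quantities here are invented for illustration.
INVENTORY = {"cheese_g": 400, "butter_g": 250, "jam_g": 300}

RECIPES = {
    "grilled cheese": {"cheese_g": 40, "butter_g": 10},  # amounts per serving
}

def cook(recipe, servings, inventory=INVENTORY, waste=0.0):
    """Deduct a recipe's ingredients (plus a waste fraction) from inventory."""
    for ingredient, per_serving in RECIPES[recipe].items():
        used = per_serving * servings * (1 + waste)
        inventory[ingredient] = max(0, inventory[ingredient] - used)

cook("grilled cheese", servings=3, waste=0.1)
print(INVENTORY["cheese_g"])  # 400 - 40*3*1.1 -> 268.0
```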

      • Vegaprime@lemmy.world · 1 year ago

        Rolling this out for tools and parts at my work. Tool boxes with cameras in the drawers to make sure you put it back. Vending machines for parts with auto order.

      • Komatik@lemmy.world · 1 year ago

        I think you can achieve a similar result by having one giant DB so we can average out general consumption, and then a personal/family profile where we manually feed the AI data like what we bought, the exp date, and when we partly or fully consumed it. Although intensive at first, I think the AI will become increasingly accurate, whereby you will need to input less and less data, as the data will be coming from both you and the rest of the users. The only thing that still needs to be input is “did you replace it?”

        This way we don’t need cameras

          • Komatik@lemmy.world · 1 year ago

            Sure, no problem, I just need you to punch in some data manually so we can get started. Can you get this stack done by tomorrow? Awesome, see you tomorrow!

        • theblueredditrefugee@lemmy.dbzer0.com · 1 year ago

          Oh, so you’re saying that the only data the algorithm needs in the limit is whether or not the user deviated from the generated shopping list, and if so, how, right?

          This is true, it’s just a bit difficult to cross the gap from here to there

  • AeonFelis@lemmy.world · 1 year ago

    I’m sure there are companies who’d love to develop something like this. And collect that information about exactly what groceries you currently have and statistics of how you consume them, so they can sell it to advertisers. Not advertisers that sell these groceries, of course - for these the AI company could just make the AI buy them from suppliers that pay them.

    • kromem@lemmy.world · 1 year ago

      They already exist and have been doing this for a long time, they are just using dumber versions of deep learning than what we have right now.

      Less about giving your personal information to an advertiser though and more about using aggregate data trends to guide marketing efforts.

      Like, if you know buns and hot dogs sell like crazy the week before July 4th, merchandizing bundles of both that override brand purchase intent in favor of convenience and discount.

      An example of this kind of market research in action would be a clothes store that knows 20% of its sales were to people who shopped the day before they came back to buy, offering 48hr exit coupons valid the next day for a limited time.

      The personalized data is used in house at these aggregators to market to you directly, such as the war and peace length personalized coupons on receipts where they’ve been contracted by the retailers.

    • danciestlobster@lemmy.world · 1 year ago

      Not just advertisers, it would also get sold to food manufacturers and product developers. This is not so bad though cause it helps new products come out that might be in line with what you want

    • TheSanSabaSongbird@lemdro.id · 1 year ago

      This is what “loyalty” cards are for. They give you a little discount in exchange for being able to track your purchases.

  • Professorozone@lemmy.world · 1 year ago

    I want AI to control traffic lights so that I don’t sit stopped through an entire cycle as the only car in a 1 mile radius. Also, if there is just one more car in line, let the light stay green just a couple seconds longer. Imagine the gas and time that could be saved… and frustration.

          • IndefiniteBen@leminal.space · 1 year ago

            How do I make it sound like that? You first need to build traffic light and road infrastructure that can handle advanced traffic flow, along with the processing power to make decisions based on sensor readings and rules.

            The software (AI is kinda overkill) exists to handle and optimise traffic flow over an entire city, but your software does not matter if there are insufficient sensors for the software to make decisions, or too few controllable lights to implement decisions (or both).

          • NaoPb@eviltoast.org · 1 year ago

            What they’re saying is if money was adequately invested in infrastructure, these old systems would have been upgraded 10 or 20 years ago and AI would not be necessary at all.

      • LemmyKnowsBest@lemmy.world · 1 year ago

        Thank goodness. Until every intersection becomes this intuitive, I will only continually notice the ones that hold me hostage through several cycles and/or don’t even notice I’m there, waiting at a red light for 5 minutes at 3am when I’m the only car there.

    • Grass@sh.itjust.works · 1 year ago

      Doesn’t need AI, and there are countries that already have a system in place with the same result. Unsurprisingly the places with more focus on pedestrian, cyclist, and public transit infrastructure have the most enjoyable driving experience. All the people that don’t want to drive will stop as soon as it is safe and convenient, and all those cars off the road also help with this because the lights will be queued up with fewer cars.

      • Professorozone@lemmy.world · 1 year ago

        Unfortunately, the US is king of the suburbs and I don’t see that changing any time soon.

        I know you don’t need AI to do this but I think AI would do a great job if properly employed.

        • Wogi@lemmy.world · 1 year ago

          It’s not that I can’t fathom how it could be better, I literally wish I could get rid of my car.

          I literally can’t. I live far enough away from my job that not having a car to get there every day isn’t an option, I can’t move close enough to my job to eliminate a car, and even if I did, I’m only making the drive further for my wife. We don’t live within walking distance of a grocery store. I genuinely need a car. My wife needs one too. I don’t live in a city with even shitty options for public transit. It’s just not an option. My wife doesn’t work in the same city I do, there is no bus, and the nearest bus stop to my job is a 45 minute walk from my job, and a 2 hour bus trip.

          It’s a 10 minute drive for both of us.

          If I could sell my car I fucking would. I love my car, but I’d give it up in a heartbeat if it were an option. I just don’t have the option. This is without children. When a child is thrown into the mix, we will only depend on having two cars more.

          Our mothers are aging, they live here and don’t have other support. She has licenses that lock her in to this state. We aren’t moving, and this city is a car city.

      • Professorozone@lemmy.world · 1 year ago

        That would be great, but it’s just not practical in many places.

        I looked up how to get to work using public transportation once. It was 3 hours using 3 busses and a half hour walk. LOL. I could literally do it in two hours using a bike. But I’m just not willing to spend 4 hours a day getting to work and back. I don’t know many that would if they had a choice. It’s a half-hour drive for me, but 22 miles, mostly interstate.

    • NaoPb@eviltoast.org · 1 year ago

      To be fair, there are already more intelligent traffic light systems that use sensors in the road to see if there is traffic sitting at the lights, combined with push buttons for pedestrians and cyclists. These can be combined with sensors further up the road to see if more traffic is coming and extend the periods of green light for certain sides. It may not be perfect and it may require touching up later after seeing which times could be extended or shortened. It’s not AI but it works a lot better than the old hard coded traffic lights. Here in the Netherlands there are only a handful of intersections left that still have the hard coded traffic lights.
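      The green-extension logic those sensor-actuated lights use can be sketched in a few lines (timings here are illustrative, not from any real controller):

```python
# Sketch of a vehicle-actuated green extension (all timings are made up).
MIN_GREEN, MAX_GREEN, EXTENSION = 10, 60, 3  # seconds

def green_duration(detector_hits):
    """detector_hits[i] is True if a car crossed the loop sensor at second i.
    Green is held as long as a car was seen in the last EXTENSION seconds,
    up to MAX_GREEN."""
    t = MIN_GREEN
    while t < MAX_GREEN and any(detector_hits[max(0, t - EXTENSION):t]):
        t += 1
    return t

print(green_duration([False] * 60))                # no traffic -> 10
print(green_duration([True] * 20 + [False] * 40))  # queue clears -> 23
```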

      • Professorozone@lemmy.world · 1 year ago

        Not near me. I can’t speak to the entire US, but everywhere I’ve been, it’s horrible. In Germany they have a green wave, where all of the lights are green if you go the speed limit. I have only encountered this twice within 200 miles of where I live.

    • LemmyKnowsBest@lemmy.world · 1 year ago

      You and Sarah Radz and everyone else here with brilliant practical ideas need to submit your resumes to the Silicon Valley-esque corporations that commandeer such industries, and be hired on as brains.

      • Professorozone@lemmy.world · 1 year ago

        Thank you for the kindness. At least I think it’s kind. I don’t know who Sarah Radz is. So I choose to accept this as a compliment.

  • eclectic_electron@sh.itjust.works · 1 year ago

    This is a surprisingly difficult problem because different people are okay with different brand substitutions. Some people may want the cheapest butter regardless of brand, while others may only buy brand name.

    For example, my wife is okay with generic chex from some grocery stores but not others, but only likes brand-name Cheerios. Walmart, Aldi, and Meijer generic cheese is interchangeable, but brand-name and Kroger-brand cheese isn’t acceptable.

    Making a software system that can deal with all this is really hard. AI is probably the best bet, but it needs to be able to handle all this complexity to be usable, which is a lot of up-front work.

    • kromem@lemmy.world · 1 year ago

      As long as the AI has access to their ongoing purchase histories it’s actually quite easy to have this for day to day situations.

      Where it would have difficulty is unexpected spikes in grocery usage, such as hosting a non-annual party.

      In theory, as long as it was fine tuned on aggregate histories it should be decent at identifying spikes (i.e. this person purchased 10x the normal amount of perishables this week, that typically is an outlier and they’ll be back to 1x next week), but anticipating the spikes ahead of time is pretty much impossible.
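      Spotting that kind of outlier doesn’t even need a learned model; a simple z-score over recent weekly spend would flag the 10x party week (the numbers below are invented):

```python
from statistics import mean, pstdev

# Sketch: flag a week whose grocery spend is an outlier vs. recent history.
def is_spike(weekly_spend, threshold=3.0):
    """True if the latest week deviates from the historical mean by more
    than `threshold` standard deviations."""
    history, latest = weekly_spend[:-1], weekly_spend[-1]
    mu, sigma = mean(history), pstdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold

print(is_spike([52, 48, 55, 50, 49, 500]))  # party week -> True
print(is_spike([52, 48, 55, 50, 49, 53]))   # normal week -> False
```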

      • Seasoned_Greetings@lemm.ee · 1 year ago

        Both of these problems could feasibly be solved by user input. If you had the ability to set rules for your personal experience, problems like that would only last as long as it takes the user to manually correct.

        Like, “Ai, I bought groceries for a party on March 5th. Don’t use that bill to predict what I need” or “stop recommending butter that isn’t this specific brand”
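        Those manual overrides could be plain rules layered on top of whatever the model predicts; a sketch with hypothetical dates and brands:

```python
from datetime import date

# Sketch of user-supplied override rules (dates and brands are hypothetical).
EXCLUDED_DATES = {date(2024, 3, 5)}   # "party" shopping trips to ignore
BRAND_PINS = {"butter": "Kerrygold"}  # only ever suggest this brand

def usable_for_prediction(purchase):
    """Drop purchases the user has flagged as non-representative."""
    return purchase["date"] not in EXCLUDED_DATES

def apply_brand_pin(item, candidates):
    """Filter suggestions down to the user's pinned brand, if one is set."""
    pinned = BRAND_PINS.get(item)
    if pinned:
        candidates = [c for c in candidates if c["brand"] == pinned]
    return candidates

history = [
    {"date": date(2024, 3, 5), "item": "chips"},
    {"date": date(2024, 3, 12), "item": "butter"},
]
print([p["item"] for p in history if usable_for_prediction(p)])  # -> ['butter']
```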

    • Prophet@lemmy.world · 1 year ago

      Also quite difficult from a vision perspective. Tons of potential object classes, objects with no class (e.g., leftovers, homemade things), potential obfuscation if you are monitoring the refrigerator/cabinets. If the object is in a container, how do you measure the volume remaining of that substance? This is just scratching the surface I imagine. These problems individually are maybe not crazy challenging but they are quite hard all together.

      • kromem@lemmy.world · 1 year ago

        You don’t use vision, or if using it you are only supplementing a model that is mostly using purchase histories as the guiding factor.

        • TheGreenGolem@lemmy.dbzer0.com · 1 year ago

          But you actually need vision because purchase history is not indicative of my future purchases. Sometimes I buy butter and eat it in 3 days and buy again. Sometimes I’m not in the mood and have a chunk of butter sit in my fridge for 3 weeks. It’s honestly totally random for a lot of things. It depends only on my mood at the moment.

          • kromem@lemmy.world · 1 year ago

            You’d be surprised at how many of those things you think are random would actually emerge as a pattern in long enough purchase history data.

            For example, it might be that there’s a seasonality to your being in the mood. Or other things you’d have bought a week before, etc.

            Over a decade ago a model looking only at purchase history for Target was able to tell a teenage girl was pregnant before her family knew just by things like switching from scented candles to unscented.

            There’s more modeled in that data than simply what’s on the receipt.

        • Prophet@lemmy.world · 1 year ago

          I agree, in the context of the tweet, that purchase history is enough to build a working product that roughly meets user requirements (at least in terms of predicting consumed items). This assumes you can find enough purchase history for a given user. Even then, I have doubts about how robust such a strategy is. The sparsity in your dataset for certain items means you will either a.) be forced to remove those items from your prediction service or b.) frustrate your users with heavy prediction bias. Some items also simply won’t work in this system - maybe the user only eats hotdogs in the summer. Maybe they only buy eggs with brownie mix. There will be many dependencies you are required to model to get a system like this working, and I don’t believe there is any single model powerful enough to do this by itself. Directly quantifying the user’s pantry via vision seems easy in comparison.

      • Bread@sh.itjust.works · 1 year ago

        There could be an easy party mode button in which it just ignores the usual and picks likely food options for a party.

      • eclectic_electron@sh.itjust.works · 1 year ago

        Honestly I would be perfectly happy with the service like this, even if I had to manually input what groceries I need. It’s still an incredibly complex problem though. AI is probably better suited for it than anything else since you can have iterative conversations with latest generation AIs. That is, if I tell it I need cereal, it looks at my purchase history and guesses what type of cereal I want this week, and adds it to my list, I can then tell it no, actually I want shredded mini wheats.

        So it would probably have to be a combination of a very large database and information gathering system with a predictive engine and a large language model as the user interface.

  • Baines@lemmy.world · 1 year ago

    Google used to do this type of stuff, then you got SEO shit; in the same way, people would try to game the system and ruin it.

    • psmgx@lemmy.world · 1 year ago

      Aye, this be the problem. As long as there is a profit motive the AI is going to steer you to whatever makes them money, be it whoever works the SEO game or pays for API access.

  • Snapz@lemmy.world · 1 year ago

    We were already robbed of the brief value stage of AI, it came out of the gate with a corporate handler and a ™

    The internet had a stretch where it was just useful, available, and exciting. AI never got that.

    • danielbln@lemmy.world · 1 year ago

      Local models are a thing, and GPT is extremely useful in some cases, even with the corporate handholding. I find the whole space super exciting, personally.

      • Snapz@lemmy.world · 1 year ago

        The accessibility of local models is nowhere near what the early web was. We could ALL have a geocities website and our own goofy “corner of the internet” without the extra bullshit.

        • sighofannoyance@lemmy.world · 1 year ago

          Be the person that makes local AIs the new GeoCities. Make the tech accessible, bro! I will invest in your crowdfund... after due diligence only, that is.

    • jivemasta@reddthat.com · 1 year ago

      I mean, when that xkcd was made, that was a hard task. Now identifying a bird in a picture can be done in real time on a Raspberry Pi as a weekend project.

      The problem in the OP isn’t really a limitation of AI; it’s coming up with an inventory management system that can detect low inventory without being obtrusive in a user’s life. The rest is just scraping local stores’ prices and compiling a list with some annealing algo that gets the best price-to-stops ratio.
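      For a handful of stores, that price-to-stops tradeoff can even be brute-forced before reaching for annealing; the prices and stop penalty below are made up:

```python
from itertools import combinations

# Sketch: pick the set of stores minimizing total cost plus a per-stop penalty.
# Store names and prices are invented; a real system would scrape them.
PRICES = {
    "StoreA": {"milk": 3.0, "eggs": 4.0, "bread": 2.5},
    "StoreB": {"milk": 2.5, "eggs": 4.5, "bread": 2.0},
    "StoreC": {"milk": 3.5, "eggs": 3.0, "bread": 3.0},
}
STOP_PENALTY = 2.0  # how much an extra store trip "costs" you

def best_plan(shopping_list):
    """Exhaustively try every subset of stores; buy each item at its
    cheapest store within the subset, then add the per-stop penalty."""
    best = (float("inf"), None)
    stores = list(PRICES)
    for r in range(1, len(stores) + 1):
        for subset in combinations(stores, r):
            cost = sum(min(PRICES[s][item] for s in subset) for item in shopping_list)
            cost += STOP_PENALTY * r
            best = min(best, (cost, subset))
    return best

print(best_plan(["milk", "eggs", "bread"]))  # -> (11.0, ('StoreB',))
```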

      • IndefiniteBen@leminal.space · 1 year ago

        I think you focused too much on the details…

        AI image manipulation is entirely based in a computer where an image is processed by an algorithm. Grocerybot involves many different systems and crosses the boundary between digital and physical. The intertwined nature of the complexity is what makes it (relatively) difficult to explain.

    • Alexstarfire@lemmy.world · 1 year ago

      No. That’s just what they wanted you to believe. All they really did was find a way to separate people from more money.

      I found out two people in my family bought smart fridges and both listed watching tv and listening to music as reasons for purchase. Not the only ones mind you, but some of the first ones mentioned. I don’t get it.

    • Billygoat@catata.fish · 1 year ago

      Exactly. But also I’m blown away that most grocery stores don’t list inventory and prices on the website. I can only think this is because they don’t want to show prices in an attempt to get you to go to the store.

      • fsxylo@sh.itjust.works · 1 year ago

        I mean… Yeah? Grocery stores want you in the store. If they didn’t they’d be shipping only warehouses.

        • Billygoat@catata.fish · 1 year ago

          I just mean that they must have done some research that says it is more profitable to only list a few prices instead of everything.

      • vivadanang@lemm.ee · 1 year ago

        But also I’m blown away that most grocery stores don’t list inventory and prices on the website. I can only think this is because they don’t want to show prices in an attempt to get you to go to the store.

        yuuup.

  • fl42v@lemmy.ml · 1 year ago

    It’s not exactly an AI task, I guess? Pretty much the only AI-related thing there is classifying stuff in OCR’d receipts (technically, one can OpenCV whatever is in the fridge, but I suspect it won’t be reliable enough).

    • Stamets@lemmy.world (OP) · 1 year ago

      Bruh. If AI is being taught to drive cars on the open road then I feel like cameras to detect what’s in your fridge is pathetically easy in comparison and very much an AI task

      • ricecake@sh.itjust.works · 1 year ago

        That’s how you get weird things like the AI determining that your favorite items are jam, baking soda and whatever you left at the back of your fridge to rot for six months.

        It is easy to detect what’s in your fridge. We have that today on some smart fridges.

        The problem to be solved though is

        • what’s in your fridge
        • what’s not in your fridge
        • what do you consume vs throw away
        • what do you buy
        • where do you shop
        • what prices are available
        • what’s the best way to minimize cost and store trips
        • what’s your metric for how to balance that

        Of those things, AI is really only helpful for determining the metric for how much money you need to save to add another grocery stop, and knowing that the orange blob is probably baking soda.

        Most of the rest of that is manual inputs or relatively basic but tedious programming, and those are the parts that would be the most annoying.
        I say this as a person who has repeatedly utterly failed to use https://grocy.info/ because actually recording what you eat vs throw away is painful.

        This isn’t a great AI problem not because AI can’t help, but because the tedious part isn’t the part it can help with right now.

        • decisivelyhoodnoises@sh.itjust.works · 1 year ago

          Yeah, it’s not even remotely possible for someone to manually input that they eat 2 slices of cheese, 20 grams of butter, and 20 grams of jam every time they do so. And it is not feasible for AI to see inside closed packages or jars to tell how much has been eaten.

      • IronicDeadPan@lemmy.world · 1 year ago

        Probably would make sense to start with the receipts for what you purchase and aggregate lists from there (pantry, freezer, fridge, etc.).

      • fl42v@lemmy.ml · 1 year ago

Yeah, kinda. Except you’ll likely need a camera or two for each shelf of the fridge (assuming the layout remains unchanged), and you also have to make sure they don’t get covered with ice/spilled milk/whatever, or blocked by a box of some stuff. Aaaalternatively, you install a receipt scanner and touch screen that asks you what you took and updates an internal DB accordingly.

        • Stamets@lemmy.worldOP
          link
          fedilink
          arrow-up
          3
          ·
          1 year ago

          No, not even kinda. Fully. Amazon has stores you can walk in and take whatever you want off the shelf and leave. If you put it back somewhere else, even if not on the same shelf, it can still track that.

          A fridge is a joke.

          • Eccentric@sh.itjust.works
            link
            fedilink
            arrow-up
            6
            ·
            1 year ago

            I actually work in this field and it’s a lot more complicated than it sounds. When you’re training AI to recognize products in a store, you have a set list of products it needs to be trained on. A person might go to many different stores which increases the possible variation of products exponentially. Amazon’s model is also much more complex than just cameras, involving weight sensors in shelving, pressure detection, facial recognition. A store where everything is laid out in predictable, well lit, organized rows is already a nightmare. A fridge, even if it’s way smaller, is way, way less predictable

          • ImpossibilityBox@lemmy.world
            link
            fedilink
            arrow-up
            3
            ·
            1 year ago

A typical Amazon store that I’ve been to is around 12,000–16,000 square feet. A refrigerator is approx. 20–25 cubic feet of real estate.

            Miniaturization of any system is always going to be a massive hurdle.

Amazon uses biometric recognition to determine if a person has picked up something, plus RFID tags, weight sensors, cameras, laser gates, and probably some other things they aren’t telling us about.

They also have a specific list of the items in the store and 3D models of where each item is. Nothing unexpected.

For the fridge to work, it would need to know every product ever made and have accurate and reliable scans of each existing product. Sure, it might be able to find SOME items of the same type, but it will only really work once it can find the EXACT item that I want, every time.

Good luck finding my favorite brand of gochujang that can’t be purchased online and is only available from a shady mom-and-pop grocery in Asia town.

            LASTLY… what’s a camera going to do with this:

      • CeeBee@lemmy.world
        link
        fedilink
        arrow-up
        3
        ·
        1 year ago

        then I feel like cameras to detect what’s in your fridge is pathetically easy in comparison

        But you’re skipping over a huge amount of context that’s missing. It’s context we (as humans) take for granted. What’s the difference between a jar and a bottle? Is the cream cheese in a tub or in a little cardboard container? Then it would need to be able to see all items in a fridge, know the expiration dates for each thing, know what you want to get, how quickly something gets used, etc.

        Some of those things are more straightforward, and some of them need data well beyond “this container has milk”. The issue isn’t processing all the data, but acquiring it consistently and reliably. We humans are very chaotic with how we do stuff in the physical world. Even the most organized person would throw off an AI system every so often. It’s the reason self driving cars are not a reality yet and won’t be for a while.

      • Ephera@lemmy.ml
        link
        fedilink
        arrow-up
        4
        ·
        1 year ago

The problem is that “AI” is a completely ill-defined term. The commenter above used the definition of it just being a more complex program, and then argued that you don’t need a more complex program. That’s as good a definition as any other.

      • fl42v@lemmy.ml
        link
        fedilink
        arrow-up
        3
        ·
        1 year ago

By “AI tasks” I mean something where AI is actually useful, such as object/pattern recognition, object classification, making predictions based on past data, etc. Can one train an AI to predict they need to buy onions when they have fewer than X in their fridge? Yep. Can one do the same with an if statement, and prevent themselves from running into issues when the ambient temperature on Mars rises? Also, yes.
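And to be fair, the if-statement version really is that short. A toy sketch, with the item names and thresholds entirely made up:

```python
# Hypothetical pantry check: no model needed, just fixed thresholds.
THRESHOLDS = {"onions": 3, "milk": 1}

def shopping_list(inventory):
    """Return items whose stock fell below a fixed minimum."""
    return [item for item, minimum in THRESHOLDS.items()
            if inventory.get(item, 0) < minimum]

print(shopping_list({"onions": 2, "milk": 4}))  # ['onions']
```

No Martian weather is going to break that.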

        • TrickDacy@lemmy.world
          link
          fedilink
          arrow-up
          3
          ·
          edit-2
          1 year ago

          An AI task would be literally anything impossible or slow for a human to do that a computer could do instead (without having developers specifically work for months to provide explicit instructions on how to do it). Kinda weird to see technology evolving like this and still set arbitrary defining parameters like that

          • Ookami38@sh.itjust.works
            link
            fedilink
            arrow-up
            2
            ·
            1 year ago

I mean, think of it like physical tools. You can use a screwdriver like a hammer, but it’s slow, it’s not what it was designed for, it has a higher chance of injury, etc. If it’s something better done with a hammer, well… that’s a hammer task, not a screwdriver task.

“AI tasks” would then be things that aren’t as easily solved with other tools. You run into a lot of issues with the refrigerator and AI. You can’t easily just visually verify what things are. What if you don’t have standard packaging and are using, say, Tupperware? Or you have one jar with milk and another with cream? Those aren’t as simple as having a camera look at them and figure it out.

In this case, a simpler, manually managed DB (entries either typed in or scanned, if the packaging allows) would be much better for fridge itemization. Then, for the “finding the best prices” problem, there already exist some apps that do that, but I could see an AI implemented just for that step potentially being beneficial, depending on how you’re finding sale info.
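The manual-DB approach is genuinely small. A minimal sketch with stdlib sqlite3 (the schema, barcode, and quantities here are made up; a real app would hang a barcode scanner off `scan_in`):

```python
import sqlite3

# Hypothetical fridge inventory DB: items keyed by barcode (or a made-up
# ID for leftovers in Tupperware), updated by hand or by a scanner.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE fridge (
    barcode TEXT PRIMARY KEY,
    name TEXT,
    quantity REAL,
    unit TEXT)""")

def scan_in(barcode, name, quantity, unit):
    # Upsert: add to the existing quantity if the item is already tracked.
    conn.execute(
        "INSERT INTO fridge VALUES (?,?,?,?) "
        "ON CONFLICT(barcode) DO UPDATE SET quantity = quantity + ?",
        (barcode, name, quantity, unit, quantity))

def use(barcode, amount):
    conn.execute("UPDATE fridge SET quantity = quantity - ? WHERE barcode = ?",
                 (amount, barcode))

scan_in("0123456789", "milk", 2.0, "litre")
use("0123456789", 0.5)
qty = conn.execute("SELECT quantity FROM fridge WHERE barcode = ?",
                   ("0123456789",)).fetchone()[0]
print(qty)  # 1.5
```

Tedious to keep updated, as noted above, but trivially reliable compared to cameras.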

          • fl42v@lemmy.ml
            link
            fedilink
            arrow-up
            2
            ·
            edit-2
            1 year ago

Hmm, guess I wasn’t clear. It’s not “arbitrary defining parameters”, but more of an “AI is a tool that better solves specific types of tasks” kind of thing. Can you replace an if statement with an AI? Yes, but that’s somewhat like hammering a screw (that is to say, inefficient).

      • Programmer Belch@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        3
        ·
        1 year ago

        AI is just a program that learns from the information you provide it to predict the next element in a series.

If you want a program to check what’s in your fridge, a simple spreadsheet updated whenever you empty a bag is just as good.

A use for AI could be to update the spreadsheet with images from inside the fridge, but you would need cameras that can work inside fridges.

    • TrickDacy@lemmy.world
      link
      fedilink
      arrow-up
      9
      ·
      1 year ago

Why are so many of you trivializing the fact that providing perfectly formatted input data and having set logic figure something out is a VERY different thing from providing a firehose of data and asking the software to make sense of it? Like, have you been paying attention here at all?

      • lad@programming.dev
        link
        fedilink
        arrow-up
        1
        ·
        1 year ago

In this case I would suppose there’s no need for a firehose of data, especially if run locally. The user only has so many shops around, and a fridge is not factory-scale big.

        • TrickDacy@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          1 year ago

Yeah, true. I guess I should have said a mishmash of data. It’s more about the fact that the data wouldn’t necessarily be in some regular format – the majority of the work you want the machine to do is finding and compiling that data.

    • CluckN@lemmy.world
      link
      fedilink
      arrow-up
      8
      ·
      1 year ago

AI could potentially handle “write me a Python script that scrapes a website for grocery prices and compares them with another” or something.
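A hand-rolled version of that comparison script isn’t huge either. This sketch assumes, purely hypothetically, that each store page marks prices like `<span class="price" data-item="...">`; real sites differ and would need an actual scraper:

```python
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Pull item -> price out of hypothetical store markup."""
    def __init__(self):
        super().__init__()
        self.prices = {}
        self._item = None  # item name of the price span we're inside

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "span" and attrs.get("class") == "price":
            self._item = attrs.get("data-item")

    def handle_data(self, data):
        if self._item:
            self.prices[self._item] = float(data)
            self._item = None

def cheaper_store(page_a, page_b, item):
    a, b = PriceParser(), PriceParser()
    a.feed(page_a)
    b.feed(page_b)
    return "A" if a.prices[item] <= b.prices[item] else "B"

store_a = '<span class="price" data-item="onions">1.29</span>'
store_b = '<span class="price" data-item="onions">0.99</span>'
print(cheaper_store(store_a, store_b, "onions"))  # B
```

The “or something” is doing a lot of work, though: fetching the pages and keeping up with each site’s markup changes is the actual chore.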

    • EnderMB@lemmy.world
      link
      fedilink
      arrow-up
      2
      ·
      edit-2
      1 year ago

      In my experience, most things touted as AI are mostly rule-based or graph-based, with a sprinkling of some classification somewhere for a manager to get that sweet VC money.

      That’s not to say that this couldn’t be done with AI, particularly one that is trained on top of a rule-based system to find the best options for given circumstances.

  • where_am_i@sh.itjust.works
    link
    fedilink
    arrow-up
    18
    ·
    1 year ago

    I’m sure Sara is not ready to be served the optimal outcome from a competitive multi-agent simulation. Because when everyone gets that AI, oh boy the local deals on groceries will be fun.