• Smorty [she/her] · 12 points · 11 hours ago

      apparently not. it seems they are referring to the official bs DeepSeek ui for your phone. running it on your phone for real is super cool! Imma try that out now - with the smol 1.5B model
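
      a rough sketch of what i mean, in case anyone wants to try it too (assuming the `ollama` python package and the `deepseek-r1:1.5b` distill tag - just a sketch, i haven’t run this exact snippet):

      ```python
      # rough sketch: pull the small distilled model and chat with it locally.
      # assumes the local Ollama server is running (e.g. `ollama serve` in Termux)
      # and that "deepseek-r1:1.5b" is the tag you want - adjust as needed.
      import ollama

      MODEL = "deepseek-r1:1.5b"  # the smol 1.5B distill; swap for any local model

      ollama.pull(MODEL)  # downloads the weights if they aren't there yet

      response = ollama.chat(
          model=MODEL,
          messages=[{"role": "user", "content": "say hi in one sentence"}],
      )
      print(response["message"]["content"])
      ```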

        • Smorty [she/her] · 10 points · 11 hours ago

          i know! i’m already running a smol llama model on the phone, and yeah, that runs at about 2 tokens per second and makes the phone lag like crazy… but it works!

          currently i’m doing this with termux and ollama, but if there’s some better foss way to run it, i’d be totally happy to use that instead <3
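
          for anyone curious: ollama just exposes a plain http API on localhost:11434, so any FOSS frontend (or a few lines of python) can talk to it. rough sketch below - the model tag is a placeholder, use whatever you actually pulled:

          ```python
          # minimal sketch: talk to the local Ollama server running under Termux.
          # assumes Ollama is listening on its default port 11434 and that a model
          # tagged "llama3.2:1b" (placeholder) has already been pulled.
          import requests

          resp = requests.post(
              "http://localhost:11434/api/chat",
              json={
                  "model": "llama3.2:1b",
                  "messages": [{"role": "user", "content": "hello from my phone!"}],
                  "stream": False,  # one JSON blob instead of a token stream
              },
              timeout=120,  # small models on a phone are slow (~2 tokens/s)
          )
          resp.raise_for_status()
          print(resp.json()["message"]["content"])
          ```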

          • gandalf_der_12te · 1 point · edited · 5 minutes ago

            i think termux is probably already the best way to go, since it gives you linux-like flexibility. but yeah, properly wiring it up with a nice graphical user interface would be nice, i guess.

            Edit: now that i think about it, running it on some server that you rent is maybe better, because then you can access that chat log from everywhere, and it doesn’t drain your battery so much. But then you need to rent a server, so, idk.
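
            rough sketch of the rented-server idea (assuming Ollama on the server and the `ollama` python client; the hostname is a placeholder, and you’d want auth/TLS in front of it because the Ollama API has no authentication by default):

            ```python
            # sketch: same chat, but against an Ollama instance on a rented server
            # instead of the phone. "my-rented-box.example.org" is a placeholder;
            # in practice put a reverse proxy with auth/TLS in front of port 11434.
            from ollama import Client

            client = Client(host="http://my-rented-box.example.org:11434")
            reply = client.chat(
                model="deepseek-r1:1.5b",  # placeholder tag for the small distill
                messages=[{"role": "user", "content": "hi from wherever i am"}],
            )
            print(reply["message"]["content"])
            ```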

            Edit again: Actually, somebody should hook up the DeepSeek chatbot to Matrix chat, so you can message it directly through your favorite messaging protocol/app.
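
            something like this could be a starting point (rough sketch only, assuming the `matrix-nio` and `ollama` python packages; homeserver, account, password and model tag are all placeholders):

            ```python
            # rough sketch of a Matrix bot that forwards room messages to a local
            # model via Ollama and posts the reply back. matrix-nio and ollama are
            # assumed installed; homeserver / user / password / model are placeholders.
            import asyncio

            import ollama
            from nio import AsyncClient, MatrixRoom, RoomMessageText

            HOMESERVER = "https://matrix.example.org"        # placeholder homeserver
            BOT_USER = "@deepseek-bot:matrix.example.org"    # placeholder bot account
            BOT_PASSWORD = "change-me"
            MODEL = "deepseek-r1:1.5b"                       # placeholder model tag

            client = AsyncClient(HOMESERVER, BOT_USER)


            async def on_message(room: MatrixRoom, event: RoomMessageText) -> None:
                # ignore the bot's own messages so it never replies to itself
                # (a real bot would also skip messages from before it started)
                if event.sender == client.user_id:
                    return
                # run the blocking model call in a worker thread
                answer = await asyncio.to_thread(
                    ollama.chat,
                    model=MODEL,
                    messages=[{"role": "user", "content": event.body}],
                )
                await client.room_send(
                    room_id=room.room_id,
                    message_type="m.room.message",
                    content={"msgtype": "m.text", "body": answer["message"]["content"]},
                )


            async def main() -> None:
                await client.login(BOT_PASSWORD)
                client.add_event_callback(on_message, RoomMessageText)
                await client.sync_forever(timeout=30000)  # keep listening for events


            asyncio.run(main())
            ```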

    • Clbull@lemmy.world (OP) · 31 points · 11 hours ago

      A US senator has introduced a bill trying to criminalize it with a 20-year prison sentence.

      Also, the US and El Salvador are doing a prisoner deportation deal.

      I’m not making this up.

    • jmcs@discuss.tchncs.de · 4 points · 11 hours ago

      If it’s running locally, nothing.

      If you install the actual deepseek app, you might as well post your prompts openly on social media.

      • Foni@lemm.ee · 3 points · 11 hours ago

        I normally use it to help me with my English translations here on Lemmy (I can more or less get by, but I prefer it to be correct), so I’m already doing that.

        • Soup@lemmy.world · 6 points · 9 hours ago

          You can also use DeepL for that, which I’m willing to bet uses a lot less energy and is a very reliable translation service.

          • Foni@lemm.ee · 1 point · 4 hours ago

            Yes, and I use it more often, but sometimes a literal translation is not enough. I know enough English to notice when something sounds off, and in those cases I go to ChatGPT or DeepSeek, which understand what you mean and its context, and give you the same thing with the same tone in another language. I don’t use it all the time, but it greatly improves the result.

        • Smorty [she/her] · 5 points · edited · 11 hours ago

          nono, the whole thing is about some people putting personal info into these chatbots.

          and even if not, they are guaranteed to train their newer models on the requests and generated responses.

          if you’re putting in personal info, running it locally/privately is kinda a must, if you care about security at all.

          i think peeps try lewd prompts once, then find out it doesn’t work, and then give up. (they don’t know about huggingface)