this way, the person can finally use their smart knowledge instead of doing boring work at the company <3

i really have to get a better structure for my funnies on 196 so peeps don't ignore them… the current way is just me infodumping, and even peeps who like my posts don't see them, because they're just so boring to look at… fair point ;(

anyway, i hope this one… is a bit interesting…

EDIT:
i'm so sorry to tell you peeps… turns out all of you live like 500 years away!! ;( i really wish i could visit you all and we could really do this!!! if someone is near Germany - Rheinland-Pfalz - hit me up and we'll try to do this!!!

if not - like if you live in the US - please maybe find other peeps in the comments to do a similar thing!!! y'all are probably more comfy to be around than me, so you can just have your own magical adventure <3 <3

  • Smorty [she/her] OP · 2 days ago

    hold on, which “ai exam” did you take? did you like have to re-make back-propagation and stuff??!! can you teach me!!!?? ~ ~ ~

    i wrote a neural network inference thingy some time ago, but just couldn't get the back-propagation thing down… how do you know how much to weigh each of the activations later in the model to affect the current node!!! AAAA
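    (the formulas i keep bouncing off of, written out for one plain dense layer - assuming i even copied them down right:)

    $$\delta_j = f'(z_j)\sum_k w_{jk}\,\delta_k \qquad \frac{\partial L}{\partial w_{ij}} = a_i\,\delta_j \qquad \frac{\partial L}{\partial b_j} = \delta_j$$

    so each later node hands its delta back through the connecting weight, the activation slope f'(z_j) scales it, and that delta times the incoming activation a_i is how much each weight should move… i think??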

    • yetAnotherUser@discuss.tchncs.de · 2 days ago

      Honest answer: ¯\_(ツ)_/¯

      Backpropagation was part of the course, but not only did hardly anyone understand it (where did the 20 different variables in 10 derivatives come from???), it's also not a topic which can easily be adapted to exams.

      So I ignored it and just learned that backpropagation exists and makes everything work.

      It was a mostly theoretical course with some Python exercises to get at least some practice.

      • Smorty [she/her] OP · 2 days ago

        oooh that's so sad ;(

        i want to know all the stuff… i already know how to do inference… but not how to train ;(

        hmmm that's why we need an nvidia person!! ~ <3 🧸 🍼 🛌 💖

        so like - okay… are you genuinely interested in doing this, or not? i am.

        • theneverfox@pawb.social · 2 days ago

          For training (and I assume you mean fine-tuning; actual training from scratch is crazy), you basically gotta pick a method, pick a framework that does your method (or read a white paper and do some crazy math to do it yourself), and then you gotta clean up your inputs.

          There are three main branches of approach as I see it: you take a model and a set of training data and make the model better fit the new data, you take two or more models and combine them, or you take one model to create another model (by using one to train the other, using one model to shrink it, or using one to upscale it).

          I haven't done any of this yet, but when I browse through Hugging Face you occasionally see the config/manifest they used to create one model from another with whatever library they used. I can keep an eye out for some examples for you if you like.

          Granted, this stuff is all just theory to me… I understand it conceptually, but I've mostly just used models so far.
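          From skimming those, the "pick a framework" route seems to boil down to roughly this - untested sketch on my end, and the model name is just a placeholder:

          ```python
          # Rough LoRA fine-tuning sketch with Hugging Face transformers + peft.
          # Untested; "some-small-model" is a placeholder - swap in whatever base model you want.
          from transformers import AutoModelForCausalLM, AutoTokenizer
          from peft import LoraConfig, get_peft_model

          base = AutoModelForCausalLM.from_pretrained("some-small-model")
          tokenizer = AutoTokenizer.from_pretrained("some-small-model")

          lora_cfg = LoraConfig(
              r=8,                                  # rank of the low-rank update matrices
              lora_alpha=16,                        # scaling applied to the update
              lora_dropout=0.05,
              target_modules=["q_proj", "v_proj"],  # which layers get adapters; depends on the architecture
              task_type="CAUSAL_LM",
          )
          model = get_peft_model(base, lora_cfg)    # wraps the base model; only the LoRA params train
          model.print_trainable_parameters()

          # ...then run your trainer of choice over the cleaned-up dataset.
          ```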

          • Smorty [she/her] OP · 1 day ago

            hmmm yes yes, you're right! <3 the methods you're referring to are

            • standard fine-tuning (LoRA or full)
            • model merging
            • distilling

            and - yes, these approaches work, but that's all using boring, predictable python or javascript libraries… and i don't like using premade boring stuff! i want to learn fully!!! how to back-propagate myself! how to actually adjust params from back to front. i want to know what percentage of the requested change at the front is the bias change, how much the weights change from the back activations, and how that split works, you know? maybe it's just 50% and i'm being dumb about it >v<

            not a boring library!!! let's make our own, super slow, bad library which does the job!!! because!!!

            • we learn!! ~ 📘 🏫
            • we understand!! (blobcat thinksmart)
            • we can say “we made this” and open-source it <3
            • we feel comfy when doing it! ~ ~ <3 🧸 🛌 💖 🍼 👩‍🍼

            and those are good reasons!!

            i don't want “experience using premade boring frameworks”. i want to

            • build my own training and inference pipeline (rough sketch at the end of this comment)
            • curate my own dataset (totally curated, not just scraped from wikipedia >v<)
            • run the model and see it generate gibberish with some wikipedia syntax!!!

            and then release it, open-source it, and feel comfy <3

            this is what i want!! ~ ~
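            like - the shape of the loop i want to end up with, pieced together from what i half-understand so far (numpy only, sizes made up, probably buggy):

            ```python
            # teeny from-scratch net: one hidden layer, numpy only, no frameworks
            # forward pass + hand-written backprop + plain SGD; shapes and data are made up
            import numpy as np

            rng = np.random.default_rng(0)
            W1, b1 = rng.normal(0, 0.1, (4, 8)), np.zeros(8)    # input(4) -> hidden(8)
            W2, b2 = rng.normal(0, 0.1, (8, 2)), np.zeros(2)    # hidden(8) -> output(2)

            x = rng.normal(size=(16, 4))    # fake batch of 16 inputs
            y = rng.normal(size=(16, 2))    # fake targets
            lr = 0.01

            for step in range(100):
                # forward
                z1 = x @ W1 + b1
                a1 = np.tanh(z1)
                y_hat = a1 @ W2 + b2                      # linear output layer
                loss = np.mean((y_hat - y) ** 2)

                # backward - chain rule, back to front
                d_out = 2 * (y_hat - y) / y.size          # dLoss/dy_hat
                dW2 = a1.T @ d_out                        # weight grad = incoming activation * delta
                db2 = d_out.sum(axis=0)                   # bias grad = just the delta
                d_hid = (d_out @ W2.T) * (1 - a1 ** 2)    # delta flows back through W2, times tanh'
                dW1 = x.T @ d_hid
                db1 = d_hid.sum(axis=0)

                # plain SGD update
                W2 -= lr * dW2; b2 -= lr * db2
                W1 -= lr * dW1; b1 -= lr * db1

                if step % 20 == 0:
                    print(step, loss)
            ```

            so (if this is right) the bias just takes the raw delta, and each weight takes delta times the activation feeding into it - no fixed 50/50 split… i think!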

            • theneverfox@pawb.social · 24 hours ago

              Hmm… Well, if you want to learn how to build a car, you could kidnap an auto engineer, or you could find a really simple car and rip it apart to get to its delicious secrets.

              At least, that's what I'd do; I'm an engineer, not a mathematician. If you understand what they're trying to do and you can see how they did it, you can play with the pieces until you understand. And that way, you can learn things they never knew when they built it.

              • Smorty [she/her] OP · 24 hours ago

                yes yes, but i want to have the nvidia person so they can finally do fun stuff with their knowledge instead of boring finance and bugfixes ;(

                • theneverfox@pawb.social · 23 hours ago

                  Well, if you do, try to grab someone from the team doing the new robotics models… I want to take that apart and play with it XD

                  • Smorty [she/her] OP · 21 hours ago

                    isn't Nvidia's GR00T N1 model already openly available?… or am i confused…