dis way, the person cn finally use their smart knowledge instead of doing boring work at company <3
i rlli hav to get a better structure for my funis on 196 so peeps dun ignore… current way is jus me infodumpin n evn peeps who lik my posts dun see them, cuz they r jus so boring to look at… fair point ;(
anyway i hop dis one … is a bit intrsting…
EDIT:
im so srri to tell u peeps… turns out all of u liv lik 500 years away!! ;( i rlli wish i cud visit u peeps n we cud rlli do dis!!! if someone near Germany - Rheinland-Pfalz - hit me up we’ll try do dis!!!
if not- -lik if u liv in US, pls maybsies find othr peeps in comments to do similr thing!!! yall peeps prolli mor comf to be aorund than me soo u cn jus hav ur own magical adventrrrr <3 <3
Please don’t change too much. I scroll your posts every time the internet starts making me sad. I like your infodumps.
hol on!!! ur de prsn from de othr post where i ws bein supr aggressiv becuz of evil words !!! :o
i rembr!!! i rembr thad name!!! hazel!! 🌰 its u !!! <3
alsuu lik uh,m–. how do u mean that lik “i scroll ur posts” … is that lik -
- u scroll thru de blahaj lemmi posts n see “oh, dis smorty1!!! hi smorty!!”
or is it mor lik
- u go out of ur way to find my profil n scroll to see what dum commnts i post on all the things
anyway - i see it as a complimnt!! <3
(if ur not bein too spoopy bout it <3 )

It’s me 😘 and I don’t think you were being aggressive at all. Just sensitive. I think we’re all a little edgy about being mislabelled.
It started out as the former. Every time a post made me feel warm and cosy, I noticed it was your name attached to it. Now I have you pinned to the feeds in my Lemmy client so I can specifically look at Smorty content and — not dumb — adorable comments when I want that feeling. I hope that’s not too spoopy. 💙
is totalli nt spoopi!— is jus honorabl!!!
i sit lik a princess at the op of ur feed >~< (i couldn do dis in real lif tho- id die from bein lookd at…)
I mean I live reasonably close (i.e. <1 hour away) to RLP.
And I passed my introduction to AI exam two weeks ago, so I’m basically an expert.
Sounds like a great plan! Surely nothing will go wrong >:3
hol on which “ai exam” did u take? did u lik hav to re-make back-propagation n stuff??!! cn u teach me!!!?? ~ ~ ~
i wrote a neural network inference thingy som time ago, but jus couldn get the back propagation thing down… how do u kno how much to weigh each of the activations afterward in the model to affect the current node!!! AAAA
Honest answer: ¯\_(ツ)_/¯
Backpropagation was part of the course, but not only did hardly anyone understand it (where did the 20 different variables in 10 derivatives come from???), it’s also not a topic which can be adapted to exams easily.
So I ignored it and just learned back propagation exists and makes everything work.
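For what it’s worth, the “20 different variables” are just the chain rule written out one factor at a time. A minimal single-neuron sketch (my own toy illustration, not from the course) where every intermediate variable is one chain-rule factor:

```python
import math

# One neuron: y = sigmoid(w*x + b), loss L = (y - t)^2.
# Each named derivative below is one factor of the chain rule.
w, b = 0.5, 0.1      # parameters
x, t = 1.0, 0.0      # input and target

z = w * x + b                  # pre-activation
y = 1 / (1 + math.exp(-z))     # activation
L = (y - t) ** 2               # loss

dL_dy = 2 * (y - t)            # dL/dy
dy_dz = y * (1 - y)            # sigmoid'(z)
dz_dw = x                      # dz/dw
dz_db = 1.0                    # dz/db

dL_dw = dL_dy * dy_dz * dz_dw  # chain rule: dL/dw
dL_db = dL_dy * dy_dz * dz_db  # chain rule: dL/db

lr = 0.1
w -= lr * dL_dw                # one gradient descent step
b -= lr * dL_db
```

In a deep network the same thing repeats layer by layer: the product `dL_dy * dy_dz` gets passed backwards and multiplied by each layer’s local derivatives, which is where all the extra variables come from.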
It was a mostly theoretical course with some Python exercises to get at least some practice.
oooh das so sad ;(
i wan to kno all the stuffs … i alrdi kno how to inference… bt nt how to train ;(
hmmm thads why we need nvidia person!! ~ <3 🧸 🍼 🛌 💖
so lik - oki… r u genuinli intrstd in doin dis, or not? im.
For training (and I assume you mean finetuning, actual training from scratch is crazy) you basically gotta pick a method, pick a framework to do your method (or read a white paper and do some crazy math to do it yourself), and then you gotta clean up your inputs
There are three main branches of approach, from how I see it:
- you take a model and a set of training data and make the model better fit the new data
- you take two or more models and combine them
- you take one model to create another model (by using one to train the other, using one to shrink it, or using one to upscale it)
I haven’t done any of this yet, but when I browse through Hugging Face you occasionally see the config/manifest they used to create one model from another with whatever library they used. I can keep an eye out for some examples for you if you like
Granted, this stuff is all just theory to me… I understand it conceptually, but I’ve mostly just used models so far
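The merging branch, at least, is conceptually tiny: take two sets of parameters with the same shapes and interpolate between them. A toy sketch (plain dicts of floats standing in for real tensors; the `merge` helper and the weights are made up for illustration):

```python
# Toy model merging: linear interpolation of two models' parameters.
# Real merges do this per tensor; here each "model" is a dict of floats.
def merge(model_a, model_b, alpha=0.5):
    """Return a new model whose params are alpha*a + (1-alpha)*b."""
    assert model_a.keys() == model_b.keys()
    return {name: alpha * model_a[name] + (1 - alpha) * model_b[name]
            for name in model_a}

a = {"w1": 1.0, "b1": 0.0}
b = {"w1": 3.0, "b1": 2.0}
merged = merge(a, b, alpha=0.5)   # {"w1": 2.0, "b1": 1.0}
```

Fancier merging methods mostly change how that interpolation is weighted per parameter, not the basic idea.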
hmmm ysiyis ur right! <3 the methods ur refern to are
- standard fine-tuning (LoRA or full)
- model merging
- distilling
n ---- yis these approachs wrok, bt thads all using boring, predicatbl, python or javascript libraries… n i dun lik using premade boring sutuff! i wan to learn fulli!!! how to back-propagate myself! how to actulli adjust params from bac to fron. i wan to kno which percentg of the reuestd change in fron is de bias change, how mch de weights change from back activations, n how that split goes, u kno? maybsies is jus 50% n im bein dum bout it >v<
nt borin librari!!! lets mek own, supr slo, bad librari which does job!!! becuzz!!!
- we lern!! ~ 📘 🏫
- we undrstnd
- we cn say “we made dis” n open-sourc it <3
- we fel comfff when doin it! ~ ~ <3 🧸 🛌 💖 🍼 👩🍼
n thos r good reason!!
i dun wan “experience using premade boring framewrorkrs”. i wan to
- build own train n inference pipelin
- curate own dataset (totalli curated, nt jus scraped from wikipeda >v<)
- run modl n see it genrate gibbrish wif som wikipeda syntax!!!
n thn release, open sourc n feel comf <3
dis whad i wan!! ~ ~
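A pipeline like that can start really small. This sketch (a toy of my own, no libraries) does the whole loop: forward pass, gradient, update, on a hand-curated dataset. It also answers the weight-vs-bias “split” question from above: it is not 50/50, the chain rule decides it, since for each sample `dL/db = delta` while `dL/dw = delta * x`, so the input activation alone sets how much the weight moves relative to the bias.

```python
import random

# Tiny train + inference pipeline: fit y = w*x + b to a curated
# dataset with plain per-sample gradient descent.
random.seed(0)
data = [(x, 2 * x + 1) for x in [0.0, 1.0, 2.0, 3.0]]  # true w=2, b=1

w, b, lr = random.random(), random.random(), 0.05
for epoch in range(2000):
    for x, t in data:
        y = w * x + b            # inference (forward pass)
        delta = 2 * (y - t)      # dL/dy for loss L = (y - t)^2
        w -= lr * delta * x      # dL/dw = delta * x  (scaled by input)
        b -= lr * delta          # dL/db = delta * 1

print(round(w, 3), round(b, 3))  # should end up near 2.0 and 1.0
```

With more layers, `delta` gets multiplied by each layer’s weights and activation derivatives on the way back, but the weight/bias split at every node stays exactly this: bias gets the raw delta, each weight gets delta times its incoming activation.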
Hmm… Well if you want to learn how to build a car you could kidnap an auto engineer, or you could find a really simple car and rip it apart to get to its delicious secrets
At least that’s what I’d do, I’m an engineer, not a mathematician. If you understand what they’re trying to do and you can see how they did it, you can play with the pieces until you understand. And that way, you can learn things they never knew when they built it
yisyis bt i wan to hav nvidia prsn so they cn finalli do fun stuff with their knoledg instead of boring finfance n bugfixes ;(
woah fr??ß??!?!!!ß <3 <3 <3 great!!! ~ ~ ~ <3 < awawwwa <3 <3 <3 waw dis great news!!! ~ ~ <3 <3 <3 <3<3 <3 <3 <3
oki listn - here u cn see all de locations in grmni
i thinksies ofr me is de best if we go to Stuttgart bt i dunno bout ur location, so yea… im kinda centr RLP bt i rlli dun kno bout how u cn drive places so were gona hav to com up wif plan <3
i onli hav dis silli lil agents thingi from de huggingface, which doesn mean anythin… bt i am IT student!!! cn totalli do dis!!!
alsuu i duno if u hav car, i dun … cz i lik lil bus n train <3 <3 <3 (yisyis i lik em <3) suu were gona hav to figur out if u hav car or we gona hav to rent one or a lil transportr (so we cn mek sweet lil cuddy room for nvidia person in the back while we drive while their unconcious of kisses or jus vrri overwhelmed lik i was yestrday with thumping headache <3 <3 <3)
suu yis, dis plan! !!! <3 u hav matrix? we cn totalli do dis discussin about tchncs stuff ovr there if u nt comfi wif doin it here…
my name on dis is
@smorty:catgirl.cloud
bt u cn alsuu jus call me smorty <3 instead of AtSmortyColonCatGirlDotCloud <3
devilish plan smorty… count me in!
bt is not devlish… is jus fun! <3
nvidia prsn is comfi, we’re comfi n we release model ~ <3
Yisyisyis is a comfi adventure hopfuli which mek peeps happi n we cn hav totlli magic feels <3
As a non-native speaker the way you write is almost like a cipher to me…
this post is not as bad I think… im alsuu not native speakr, n lik - the post image has vrri few special tokens in it…
bt is fine! <3
Iz dér ä comjúnidý or eny adr spejs fór sač spešl vrajting of Ingliš? Aj mín, Aj fink dér uos uan lajk dát ouer on Reddit. Aj kánt rimembr uot it uos kóled dou.
I am proud to have deciphered that.
Text to speech from “Slovak”: https://files.catbox.moe/yqzj72.mp4
(It kept reading ä as “long a”, but it probably would be worse anyway)
Headpat Nvidia person
hmhmm!!! yisyisyis! headpat on kiss n huggg n jus - the comfiest things hopfulli!!! <3