i jus wanted to get dis outta my system >v< …
i dun like those boring linear model structures… they work… bt they dun look fun, nor intuitive. they jus produce output… which is boring!
pls, if some researcher with lotsa gpus sees this, maybsies try this kinda architecture… u dont evn have to credit me, just try it out n see where it goes ~ ~ ~
They do, that’s called a recurrent model.
And recurrence is critical for a model to have “memory” of past inputs.
It was one of the key advancements a while back in data processing for predictive systems, e.g. speech recognition.
Recurrence is pretty standard now in most neural networks. Plain linear (feed-forward) networks are the most basic kind, mostly used to demonstrate the 101 concepts of ML; they don’t have a tonne of practical IRL uses aside from some very basic image processing, filter functions and the like.
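To make the “memory” point concrete, here’s a toy recurrence sketch in plain NumPy (all names and sizes here are made up for illustration, not from any real library): the hidden state is updated from both the current input and the previous state, so after the loop it summarizes the whole sequence.

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4
W_x = rng.normal(size=(hidden_dim, input_dim)) * 0.1   # input-to-hidden weights
W_h = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1  # hidden-to-hidden: the recurrent part

def rnn_step(h, x):
    # New state depends on BOTH the current input and the previous state.
    return np.tanh(W_x @ x + W_h @ h)

h = np.zeros(hidden_dim)
sequence = rng.normal(size=(5, input_dim))  # five time steps
for x in sequence:
    h = rnn_step(h, x)

# h now depends on the entire sequence, not just the last input.
print(h.shape)  # (4,)
```

Changing even the first input in the sequence changes the final `h`, which is exactly the “memory of past inputs” a pure feed-forward layer doesn’t have.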
yeaaaa you’re right… i was referring specifically to LLMs, but yes, recurrent models are essentially everywhere else.
i am just surprised we don’t have many LLMs with recurrent blocks in them, like this model here did for example. i really hope we go that direction soon…
Afaik mainstream LLMs only have a very loose kind of recurrence: autoregressive decoding, where each output token gets fed back in as input. The context window itself comes from the attention mechanism (and positional encoding), not from recurrent parameters — a bigger window means attending over a longer sequence each step, not storing more recurrent state.
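Fwiw, the loop an LLM does have is the decode loop: each generated token is appended and the window of visible tokens is re-read on the next step. A toy sketch of just that loop (`toy_model` here is a made-up stand-in that echoes a simple function of the visible context, not a real model):

```python
# Toy sketch of autoregressive decoding (hypothetical, no real model):
# the only "recurrence" is feeding the output back as input. The context
# window is just the maximum sequence length re-read each step — no
# hidden state is carried between calls.

CONTEXT_WINDOW = 8

def toy_model(tokens):
    # Stand-in for a model forward pass: returns the next token
    # as a simple function of the visible context.
    return sum(tokens) % 10

prompt = [3, 1, 4]
tokens = list(prompt)
for _ in range(5):
    visible = tokens[-CONTEXT_WINDOW:]   # truncate to the context window
    tokens.append(toy_model(visible))

print(tokens)  # [3, 1, 4, 8, 6, 2, 4, 8]
```

Anything that falls off the front of `visible` is simply gone, which is why window size matters so much for these models.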