Absolutely needed: to get high efficiency for this beast … as it gets better, we’ll become too dependent.
“all of this growth is for a new technology that’s still finding its footing, and in many applications—education, medical advice, legal analysis—might be the wrong tool for the job,”
as it gets better
Bold assumption.
Historically, AI always got much better eventually. Usually after the field collapsed into an AI winter and several years went by searching for a new technique, before the hype cycle repeated. Tech bros want it to get better without the winter stage, though.
That’s part of why they installed Donald Trump as the dictator of the United States. The other is the network states.
AI usually got better once people realized it wasn’t going to do everything it was hyped up to do, but was useful for a certain set of tasks.
Then it turned from world-changing hotness to super boring tech your washing machine uses to fine-tune its washing program.
Like the cliché goes: when it works, we don’t call it AI anymore.
The smart move is never calling it “AI” in the first place.
Unless you’re in comp sci, where AI is a field, not a marketing term. And in that case everyone already knows that’s not “it”.
The major thing that killed 1960s/70s AI was the Vietnam War. MIT’s AI Lab (the forerunner of today’s CSAIL) was funded heavily by DARPA. When public opinion turned against Vietnam and Congress started shutting off funding, DARPA wasn’t putting money into the lab anymore. Congress didn’t create an alternative funding path, so the whole thing dried up.
That lab basically created computing as we know it today. It bore fruit, and many companies owe their success to it. There were plenty of promising lines of research still going on.
I wish there was an alternate history forum or novel that explores this scenario.
Pretty sure “AI” didn’t exist in the 60s/70s either.
The perceptron was created in 1957 and a physical model was built a year later
Yes, it did. Most of the basic research came from there. The first section of the book “Hackers” by Steven Levy is a good intro.
The spice must flow
Each winter marks the end of one generation of AI and the start of the next. We’re now seeing more progress, and as long as there is no technical limit, it seems that progress won’t be interrupted.
What progress are we seeing?
NVL72 will be enormously impactful on high end performance.
Historically “AI” still doesn’t exist.
Technically even 1950s computer chess is classified as AI.
The issue this time around is infrastructure. The current AI Summer depends on massive datacenters with equally massive electrical needs. If companies can’t monetize that enough, they’ll pull the plug and none of this will be available to the general public anymore.
This system can go backwards. Yes, the R&D will still be there after the AI Winter cycle hits, but none of the infrastructure.
We’ll still have models like deepseek, and (hopefully) discount used server hardware
Yeah, I think there were some efforts, until we found out that adding billions of parameters to a model would let it both write the useless part of emails that nobody reads and strip out the useless part of emails that nobody reads.
I want my emails to be a series of noises that only computers can hear and communicate with
The energy issue almost feels like a red herring to distract idiots from the actual AI problems, and Lemmy is just gobbling it up every day. It’s so tiring.
Partly, yep. Seems like every time I try to pin down an AI on a detail worth asking about (a math question, or a date in history), it’ll confidently reply with the first answer it finds … right or wrong.
I don’t think accuracy is an issue either. I’ve been on the web since its inception, and we’ve always had a terribly inaccurate information landscape. It’s really about an individual’s ability to assemble the information they find into an accurate world model, and LLMs are a tool just like any other.
The real issues, imo, are the effects on society, be it information manipulation or breaking our education and workforce systems. But all of that is overshadowed by meme issues like energy use or inaccuracy, because those are easy for anyone to understand, while sociology, politics and macroeconomics are really hard.
deleted by creator
Lemmy is just gobbling it up every day. It’s so tiring.
Are you fucking serious? All I ever see on Lemmy is people saying “AI slop” over and over and over and over again… in like every comment section of every post. It could be a picture that was actually hand-drawn, or a photograph that was definitely not AI, or articles written by someone “sounding like AI”. The AI hate on Lemmy is WAY overpowering any support.
I think you misunderstood me here as we’re in agreement already
I definitely did! I guess maybe you can see why I was so exasperated. 😳
How does crypto mining play into all of the electrical need? I know they used to use a butt load.
I found this article from last year: https://www.eia.gov/todayinenergy/detail.php?id=61364
Our preliminary estimates suggest that annual electricity use from cryptocurrency mining probably represents from 0.6% to 2.3% of U.S. electricity consumption.
The wide range shouldn’t be too surprising; it’s a mess to keep track of, especially with the current administration. Since then, with Trump immediately pledging to support the “industry”, I can only imagine it’s consuming even more now.
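For scale, a quick back-of-envelope (the total U.S. consumption figure is my own ballpark, not a number from the EIA article):

```python
# Rough scale of the EIA's 0.6% to 2.3% estimate.
# US_TOTAL_TWH is an assumed ballpark, not a figure from the EIA article.
US_TOTAL_TWH = 4000            # approx. annual U.S. electricity consumption, TWh
low, high = 0.006, 0.023       # EIA's estimated share going to crypto mining

print(f"~{US_TOTAL_TWH * low:.0f} to ~{US_TOTAL_TWH * high:.0f} TWh per year")  # roughly 24 to 92 TWh
```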
That’s a huge amount of electricity even at its lowest. Whether they’re building the AI to crypto mine is another question. I could see these sneaky bastards combining the two somehow.
As for OpenAI and Microsoft, they’re betting on energy with a company called Helion Energy. They say they’ll have it ready by 2028. Whether they’ll achieve that? We’ll see.
I looked that up and it’s a fusion reactor, apparently, from what I can figure out from their nigh-illegible website.
…[reads further] …
It skips the steam cycle. That’s fucking cool. I really hope they get it working.
They won’t.
Care to elaborate?
There are lots of handwavy things. They even said they “discovered new physics” to explain some disappointing results (which other established nuclear physicists found perfectly predictable). “Oh, the physics equation was wrong, we just need to make everything 25% bigger”.
Also, the emitted radiation levels will be insane once it’s scaled up
No.
I see you don’t know how crypto works lol
I don’t see how AI helps with crypto mining. It could help with pump and dumping shit coins though.
I was thinking it probably helps by getting into the nanoseconds when buying and selling stocks. I’ve found that there are very few coincidences when it comes to huge technologies, tons of cash pouring in and dark money. I could be wrong though.
Trading has nothing to do with cryptocurrency mining. Also, any high-frequency trading firm worth their salt is using FPGAs for the things where performance really counts.
It should be clarified that it’s 99.99% Bitcoin mining that’s wasting all that energy; any other crypto that still uses mining is basically irrelevant compared to it.
Solar powered server farms in space. Self-powered, self-cooling, ‘outside the environment’. Is this a stupid idea?
Edit: So it would seem the answer is yes. Good chat :) Thanks.
Very stupid. One of the most difficult things in space is cooling stuff. Sending up a bunch of space heaters in a box (almost all of the energy pumped into a computer is turned into heat; the actual computation takes next to nothing in comparison) is definitely not a good idea. Definitely not one thought up by a technical person.
Also, you can make computers much cheaper and more reliable, more maintainable and much, much faster if you protect them from space radiation by operating them down here, under the protection of Earth’s atmosphere.
Launch cost is astronomical.
Maintenance access is horrible.
Temperature delta is insane, up to 250 °C.
I don’t understand the self-cooling. Isn’t it harder to keep things cool in space, since there is no conduction or convection cooling? I mean, everything is in a vacuum. The only way to shed heat is radiation, and that’s terribly inefficient. Seems like a massive engineering problem.
It is. Infrared radiators weigh a shit ton and are inefficient, big and unwieldy. Still the only viable option for cooling in space. AI would take a humongous square footage of them just so the GPUs won’t melt.
I thought so.
You can cool servers way better on Earth than you can in space. Down here you can transfer heat away from the server with conduction and convection, but in space you really only have radiation. Cooling spacecraft is an engineering challenge. One might imagine a server stuck inside a glass thermos that’s sitting out in the sun.
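To put rough numbers on the radiator problem, here’s a sketch using the Stefan-Boltzmann law; the radiator temperature, emissivity and heat load are all assumptions, not figures from the thread:

```python
# Radiative cooling sketch: power radiated per square metre is emissivity * sigma * T^4.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)
emissivity = 0.9         # assumed radiator surface emissivity
T_radiator = 300.0       # K, assumed radiator temperature (~27 C)
heat_load_w = 1_000_000  # assumed 1 MW of server heat to reject

watts_per_m2 = emissivity * SIGMA * T_radiator**4   # about 413 W/m^2 per radiating side
area_m2 = heat_load_w / watts_per_m2
print(f"~{area_m2:,.0f} m^2 of radiator per MW")    # roughly 2,400 m^2
```

And that’s before any sunlight falls on the sunlit side, which is the “glass thermos in the sun” problem in numbers.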
If the end goal is so little Timmy can ask a robot if Nazis exist and it spits out misinformation, or so AI bots can flood social media with endless regurgitated bullshit, then no, it’s just more garbage in space.
AI is interesting… necessary? A lot of people could be fed and housed for the cost of giant, experimental solar-powered AI computers in space, built so they have more excuses not to pay people a living wage.
When I think about the potential of AI, I like to think of Iain M. Banks’ Culture more than Skynet. We could probably all live in a post-scarcity society even without AI if we put our minds to it, but let’s free ourselves of unnecessary or unwilling labour while we’re at it, eh?
I’m glad someone’s hopeful. Any time I see a new technology, I wonder what the worst possible outcome could be, and it usually makes it there.
Sorry, I just have zero faith in humanity.
Oh, I have zero expectation of an actual positive outcome. I don’t think the tech-bros read. Or think ahead.
Afaik space isn’t self cooling. Overheating of spacecraft is a thing. I think they can only cool through infrared radiation or something.
Do you know how much energy you need to launch a kilogram into Earth orbit?
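For a rough lower bound (kinetic plus potential energy only; the orbit numbers are assumptions, and real launches waste far more through the rocket equation, drag and gravity losses):

```python
# Theoretical minimum energy to put 1 kg into a ~400 km low Earth orbit.
m = 1.0            # kg
v_orbit = 7800.0   # m/s, roughly LEO orbital speed (assumed)
g = 9.81           # m/s^2
h = 400_000.0      # m, assumed orbital altitude

kinetic = 0.5 * m * v_orbit**2    # about 30 MJ
potential = m * g * h             # about 4 MJ (flat-gravity approximation)
print(f"~{(kinetic + potential) / 1e6:.0f} MJ per kg, before any rocket inefficiency")
```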
So someone thinks it’s not entirely stupid
I assume yes. I know very little, but I know space is a very hard and harsh environment. Also, it would be very expensive, I assume. And it would need to be big.
Fire bad, who cook with fire, fire burn, fire pollute, fire baaaaad
Yes, “AI” is literally contributing to the burning of the planet.
https://www.cleanairfund.org/news-item/wildfires-climate-change-and-air-pollution-a-vicious-cycle/
So is the computer you’re using
Nothing wrong with examining potential issues for emerging technology before they become actual issues.
For sure, and I agree. But that isn’t what happens around here. Instead we turn to panic rather than skepticism. We are cynical more than anything. And I don’t trust cynicism about topical subjects.
This wasn’t well-reasoned objectivity. It was journalists and artists fearing for their jobs over technology they don’t understand. They generated a lot of content to generate panic. The mob saw the panic and adopted it. You’re not a true lefty unless you accept that AI is some new threat to the lowly creative.
Do you disagree that AI presents an existential threat to our current way of life, and that a new way of life may be worse, and that we should therefore plan ahead before plunging in?
I’m going to ask you something, and please think about it. I believe the media highlighted these dangers in the same way the right-wing media highlighted the dangers of immigration. If I look at the articles, the framework is the same. The details are different, but it’s the same scaffolding. That puts my hair up.
So to answer your question, yes, I think AI poses a danger. That doesn’t mean I don’t embrace it or look forward to it positively. I believe we need to push the other way: embrace and seize it. Force awareness of it, and force legislation or cultural views that put it in a better place so these tools remain open source. We have to be the primary consumers to influence it.
I’d say I agree. Proceed, but with caution.
Also did I miss your question? I didn’t see one
No, it started as a question and I changed it to a statement but didn’t correct it 😆
What’s your point here?
They’re trying to compare “AI” to fire. If you don’t see the point, I can’t blame you.
Does the article answer the question of what the footprint of a prompt is?
Depends on the prompt, the model, the parameters, which DCs, the time of day, the location in the world, and other factors. They answer the question, but there are so many variables that can affect the footprint (and the big hyperscalers don’t release this data, so you have to estimate a lot).
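For a sense of how those variables combine, a minimal sketch; every number below is a placeholder assumption, not data from the article or from any provider:

```python
# Hypothetical per-prompt energy estimate; all inputs are made-up placeholders.
gpu_power_w = 400.0   # assumed average accelerator draw during inference
seconds = 2.0         # assumed generation time for one prompt
pue = 1.2             # assumed datacenter power usage effectiveness

energy_wh = gpu_power_w * seconds / 3600 * pue
print(f"~{energy_wh:.2f} Wh for this hypothetical prompt")   # about 0.27 Wh
```

Change any of those inputs (batching, model size, hardware, grid mix) and the answer moves by an order of magnitude either way, which is the point.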
Basically nothing worth getting angry about
The matrix is getting more and more real every day
deleted by creator