#parameter


technologyreview.com/2025/05/2

This is an actually good article about #AI #energy use and its effect on #carbon and #ClimateChange.

Here are some key things I would add.

All of these #corporations, like #OpenAI, #Microsoft, and #Google, are relying on an #LLM being accessed from the #cloud (the #internet).

#NVIDIA is counting on ever-bigger models.

Do I think this is the future of #generativeai?

No.

I think _that_ is a big #Bubble. I think every "Size Up" on an AI model gives you an extra 20% in quality.

So, running a Llama 8B is only about 20% better than running a Llama 3B.
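
Do the arithmetic on that heuristic and the diminishing returns are obvious. A toy sketch below (treating a "size up" as a doubling of parameters is my reading; the 20% figure is the guess above; the linear cost model and the sizes are illustrative assumptions, not measurements):

```python
import math

# Toy model of the "20% per size-up" heuristic above (a guess, not a
# measured scaling law): quality compounds 1.2x per doubling of parameters,
# while cost/energy is assumed to grow linearly with parameter count.
sizes_b = [1, 3, 8, 70, 405, 1000]  # illustrative sizes, in billions of params

for size in sizes_b:
    doublings = math.log2(size)    # doublings relative to a 1B baseline
    quality = 1.2 ** doublings     # +20% quality per doubling
    cost = float(size)             # cost ~ linear in params (1B = 1.0)
    print(f"{size:>5}B: quality x{quality:5.2f}, cost x{cost:7.1f}, "
          f"quality per unit cost {quality / cost:.4f}")
```

On this rule the 1000B model comes out roughly 6x "better" than the 1B one, at 1000x the cost. Quality per unit cost collapses.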

Right?

So what?

Well, 99% of the use cases people have don't require 1-trillion-#parameter models.

They require models that increasingly can be run locally.

What do I think is coming? It isn't 30-gigawatt data centers; it's laptops running on less energy than current ones, with a terabyte of RAM.

No one wants the #internet that the #MegaCorporations have created. People want their own shit. They want to own their movies.

The #cloud is dead. The companies built on the cloud are dead.

They just don't know it yet.

@mittechreview

MIT Technology Review · We did the math on AI’s energy footprint. Here’s the story you haven’t heard. By James O'Donnell

Just had an interesting (but a bit unsettling) conversation with #AI about what it thought its future abilities would be. It’s a long response but worth the read.

My question: What new #emergent #abilities will occur when #parameter #numbers can be increased by orders of #magnitude?

AI’s response: This is a fascinating and actively researched question in AI! As language models like GPT grow in parameter count by orders of magnitude (e.g., from billions to tens or hundreds of trillions),

1/8

#creepmas
#vss365
The trees above hung heavy with clumps of mistletoe, their thick green leaves and white berries bright against the naked grey branches coated in frost.
"The calculated #parameter is right Cpt."
I nod, loading a clip of AP rounds.
"Well, Santa hunts on kiddos."
The sound of soldiers switching from tranqs to live silver rounds rang through the cold air.
"Operation Naughty List is a go...!"
"HO... HO... HAROOOO!"
I look around. "We save Christmas one way: take down Lycan Claus."

So I've noticed my #LLM #HomeAssistant sometimes signing its #notifications. But it doesn't have a name, so it will either #hallucinate one or use my name or my wife's name. I could try to get it to just not do that, but it is a #7B #parameter model, so it often will try to comply but get hung up on the "structure" of what it thinks it is doing. So rather than beating it down, I was thinking I should just name it.

Any #fun names for an #automated home assistant? :wizard:
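
For anyone fighting the same thing: one approach is to pin the name in the system prompt rather than argue with the model per request. A minimal sketch, assuming the model is served by a local Ollama instance; the name "Aria", the model tag, and the prompt wording are all placeholders:

```python
import requests

# Give the small model a fixed persona name in the system prompt so it stops
# hallucinating a signature. "Aria" is a placeholder; swap in whatever wins.
SYSTEM = (
    "You are the household's home assistant. Your name is Aria. "
    "If you sign a notification, sign it as Aria, never as a resident's name."
)

resp = requests.post(
    "http://localhost:11434/api/chat",  # default local Ollama endpoint
    json={
        "model": "mistral:7b",  # placeholder tag for whatever 7B model is installed
        "messages": [
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": "Notify me that the laundry is done."},
        ],
        "stream": False,
    },
    timeout=60,
)
print(resp.json()["message"]["content"])
```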

Replied in thread

#Kurtosis was an important #parameter differentiating #size distributions, with #platykurtic distributions in #marls and #leptokurtic distributions in #limestones, suggesting that this parameter may reflect different degrees of #time #averaging. Most size #distributions were positively #skewed, but most strongly in marls. Complete #sampling led to #skewness values close to zero (#symmetrical distributions) and high kurtosis.

doi.org/10.2110/palo.2021.063
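
To see those shape statistics in action, here is a quick synthetic illustration (toy data, not the paper's measurements; note that scipy reports excess kurtosis, so a normal distribution scores 0):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Stand-ins for the distribution shapes discussed above:
# Laplace ~ leptokurtic (heavy tails), uniform ~ platykurtic (flat),
# lognormal ~ positively skewed.
samples = {
    "leptokurtic": rng.laplace(loc=1.0, scale=0.2, size=5000),
    "platykurtic": rng.uniform(low=0.2, high=1.8, size=5000),
    "right-skewed": rng.lognormal(mean=0.0, sigma=0.5, size=5000),
}

for name, x in samples.items():
    # stats.kurtosis defaults to Fisher's definition:
    # > 0 leptokurtic, < 0 platykurtic
    print(f"{name:>12}: skewness={stats.skew(x):+.2f}, "
          f"excess kurtosis={stats.kurtosis(x):+.2f}")
```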