#hash

»Via coredump – attackers can harvest password hashes on Linux:
Several versions of @ubuntu, @fedora and RHEL are vulnerable. Malicious actors can crash applications and capture confidential data.«

Darn it! I still need to take a closer look at which updates need to be applied to which machines, if that hasn't already happened automatically.

🐧 golem.de/news/per-coredump-ang

Golem.de · Via coredump: attackers can harvest password hashes on Linux · By Marc Stöckel

I went to 4/20 at Washington Square Park this year and linked up with some people I knew. Highlights were meeting Jolly Tiger, big-time streamer and CEO of Puffco. They gave out so, so much swag, including a free hot knife. I got some incredible Pakistani landrace from a Pakistani American who gets the stuff from his town back in the old country. I also scored some authentic Blue Lobster and two jars of rosin that were cheese crosses. 1/?
#puffco #cannabis #hash #rosin

Valkey's improvements to the hash data structure

Saw the post "A new hash table" on Lobsters, about Valkey's improvements to its hash data structure. Valkey is the project that rose after Redis abandoned its open source license. For projects that drop their open source license, a key question is the ratio of contributions between the "original vendor" and the "community". If only the vendor's own people participate in the project, a fork won't have much of a future; conversely, if the community contributes a fair amount, the fork has a chance to flourish. Valkey's hash improvement matters in its own right (hash is a fundamental data structure used in many places), and beyond that it shows that the community side has real energy, which makes the project worth looking forward to. …

blog.gslin.org/archives/2025/0

Gea-Suan Lin's BLOG · Valkey's improvements to the hash data structure

Guys, I've been thinking about this recently.
So the telegraph has existed for a while and became widespread in the 1800s. Charles Babbage worked on the first mechanical #computers in the 1820s.

What would be the earliest point in time at which a #Bitcoin-like #Blockchain could've been made?

As I understand it, all the system needs is a #network of computers, each running a program that checks for transactions.

I'm wondering whether, with a different hash function, it could be a lot simpler to implement in hardware. But if mechanical computing wasn't powerful enough for that, the earliest might've been after WWII, with the code-breaking machines the Allies invented.
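To make the chaining idea concrete, here's a toy Python sketch of what "blocks linked by hashes" boils down to, with a deliberately cheap non-cryptographic hash (FNV-1a) standing in for SHA-256. This is only an illustration of the chaining mechanism, not of Bitcoin's actual proof of work or networking.

```python
# Toy sketch of a hash-chained ledger (illustrative only; real Bitcoin
# uses SHA-256 and a difficulty-adjusted proof of work).

def toy_hash(data: bytes) -> int:
    """Deliberately simple non-cryptographic hash (32-bit FNV-1a).
    Something this cheap could conceivably be built into very modest
    hardware, but it offers none of SHA-256's collision resistance."""
    h = 0x811C9DC5
    for byte in data:
        h = ((h ^ byte) * 0x01000193) & 0xFFFFFFFF
    return h

def make_block(prev_hash: int, transactions: list[str]) -> dict:
    """A block commits to its transactions and to the previous block's hash,
    so altering any earlier block changes every later hash."""
    payload = f"{prev_hash}|{'|'.join(transactions)}".encode()
    return {"prev": prev_hash, "txs": transactions, "hash": toy_hash(payload)}

# Build a tiny chain and verify the link.
genesis = make_block(0, ["alice pays bob 1"])
block2 = make_block(genesis["hash"], ["bob pays carol 1"])
assert block2["prev"] == genesis["hash"]
print(genesis["hash"], block2["hash"])
```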

Any boosts would be greatly appreciated!

Continued thread

Genetics are thought to be pollen from a Grapefruit Kush male, bred by Tony Mendocino, crossed with a mystery clone from Gas Station Bob. Only got a few seeds and one became the Z. 10 years running, she still occupies the podium at most hash competitions. I’ve been a Z Stan since 2016. #cannabis #hash #ca (2/2)

#programming question, TL;DR: How to test for an (approximately) uniform #distribution?

Today at work, I created a piece of code that should #partition a stream of data entities based on some string keys of unknown format. The only requirements were that the same key must always be assigned to the same partition and the distribution should be approximately uniform (IOW all partitions should have roughly the same size). My approach was to apply a non-cryptographic #hash function to the keys (defaulting to #xxhash3), XOR-fold the hash down to 32 bits and then take this as an unsigned integer modulo the desired number of partitions.
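Roughly, the idea is something like this (a simplified Python sketch, assuming the xxhash package; the real code at work differs in detail, but the hash, XOR-fold and modulo steps are the same):

```python
import xxhash  # assumed binding for xxhash3; any stable non-cryptographic hash would do

def partition_for(key: str, num_partitions: int) -> int:
    """Map a string key to a partition: hash with xxhash3 (64-bit),
    XOR-fold down to 32 bits, then take the result modulo the number
    of partitions. The same key always lands in the same partition."""
    h64 = xxhash.xxh3_64_intdigest(key.encode("utf-8"))
    h32 = (h64 >> 32) ^ (h64 & 0xFFFFFFFF)  # XOR-fold 64 -> 32 bits
    return h32 % num_partitions

print(partition_for("some-key", 13))  # a value in 0..12
```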

Normally I only write code for my private projects (as a software architect, I rarely have time to touch any code at work, unfortunately), and there I'd certainly test something like this on some large samples of input data, but probably just once, manually. 🙈

But for work, I felt this should be done by a #unittest. I also think at least one set of input data should be somehow "random" (while others should contain "patterns"). My issue is with unit-testing the case of "random" input. One test I wrote feeds 16k GUIDs (in string representation) to my partitioner configured for 13 partitions and checks that the factor between the largest and smallest partition stays < 2, so a very relaxed check. Still, doubt remains, because there's no way to guarantee this test won't go "red" eventually.
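In code, that test looks roughly like this (a simplified sketch reusing the partition_for sketch from above; the real test is structured differently):

```python
import uuid
from collections import Counter

def test_guid_keys_are_spread_roughly_evenly():
    """16k random GUID strings over 13 partitions; the largest partition
    must stay below twice the size of the smallest -- a very loose bound."""
    counts = Counter(partition_for(str(uuid.uuid4()), 13) for _ in range(16_384))
    assert len(counts) == 13
    assert max(counts.values()) < 2 * min(counts.values())
```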

I now see several possible options:
  • just ignore this, because hell freezing over is more likely than that test going red ...
  • don't even attempt to test the resulting distribution on "random" input
  • bite the bullet and write some extra code creating "random" (unicode, random length within some limits) strings from a PRNG that will produce a predictable sequence, roughly as sketched below
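A minimal sketch of that third option (placeholder length limits and code-point ranges; the fixed seed is what keeps the "random" input reproducible across test runs):

```python
import random

def make_test_keys(n: int, seed: int = 42) -> list[str]:
    """Generate n reproducible 'random' unicode keys: the fixed seed makes
    the PRNG emit the same sequence on every run, so the test stays deterministic."""
    rng = random.Random(seed)
    keys = []
    for _ in range(n):
        length = rng.randint(1, 64)            # placeholder length limits
        chars = []
        for _ in range(length):
            cp = rng.randint(0x20, 0xFFFF)     # BMP only, for simplicity
            if 0xD800 <= cp <= 0xDFFF:         # remap surrogates (not UTF-8 encodable)
                cp -= 0xD800 - 0x20
            chars.append(chr(cp))
        keys.append("".join(chars))
    return keys

assert make_test_keys(16_384) == make_test_keys(16_384)  # same seed, same keys
```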

What do you think? 🤔 The latter option kind of sounds best, but then the complexity of the test will probably exceed the complexity of the code tested. 🙈