Me and my 9 disembodied heads judging people that don’t learn an adhoc functional programming language solely so they can run Linux


Followed quickly by “vibe bankruptcy proceedings”


NixOS is a dangerous drug, but at least it’s a drug without side effects, since Nix is purely functional.


I’m partial to Terra’s theme, but yeah, that whole OST is amazing.


I remember when Steam came out and everyone hated it because of how slow and buggy it was. Crazy how times have changed.


I don’t think having well-defined precision is a rare requirement; it’s more that most devs don’t understand (and/or care about) the pitfalls of inaccuracy, because they usually aren’t obvious. Also, languages like JavaScript/PHP make it hard to do things the right way. When I was working on an old PHP codebase, I ran into a popular currency library (Zend_Currency) that used floats for handling money, which I’m sure works fine up until the point the accountants call you up asking why they can’t balance the books. The “right way” was to use the bcmath extension, which was a huge pain.
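To make the pitfall concrete, here’s a minimal sketch (in Python rather than PHP, using only the standard library) of why binary floats drift on money and how a decimal type, the analogue of PHP’s bcmath approach, avoids it:

```python
from decimal import Decimal

# Summing ten 10-cent charges with binary floats accumulates rounding error,
# because 0.10 has no exact binary representation.
float_total = sum([0.10] * 10)
print(float_total == 1.00)   # False: the sum is 0.9999999999999999

# A decimal type does the arithmetic in base 10, so the books balance.
decimal_total = sum([Decimal("0.10")] * 10, Decimal("0"))
print(decimal_total == Decimal("1.00"))  # True
```

The off-by-a-hair totals are exactly the kind of thing an accountant notices and a developer doesn’t, until the call comes.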


I work at big tech (not MS) and yes, the comp package really is that good, though not as good as it used to be. I immediately doubled my total comp when I came here from my last job, and now it’s ~5x. I could retire right now if I wanted, so I don’t care about layoffs anymore.


Cuelang: https://cuelang.org/docs/reference/spec/#numeric-values
Implementation restriction: although numeric values have arbitrary precision in the language, implementations may implement them using an internal representation with limited precision. That said, every implementation must:
- Represent integer values with at least 256 bits.
- Represent floating-point values with a mantissa of at least 256 bits and a signed binary exponent of at least 16 bits.
- Give an error if unable to represent an integer value precisely.
- Give an error if unable to represent a floating-point value due to overflow.
- Round to the nearest representable value if unable to represent a floating-point value due to limits on precision.

These requirements apply to the result of any expression except for builtin functions, for which an unusual loss of precision must be explicitly documented.


That works until you realize your calculations are all wrong due to floating point inaccuracies. YAML doesn’t require any level of precision for floats, so different parsers may give you different results for the same document.


YAML doesn’t require any level of accuracy for floating point numbers, and that doc appears to have numbers large enough to run into problems for single-precision floats (maybe double too). That means different parsers could give you different results.


In a sense, AI is already fucking with everyone’s brain when it comes to mass-produced ads and propaganda.
Depends on the monkey
Image is accurate, since without bugs, the food chain collapses and takes society with it, and the survivors will have to migrate to rural areas that can support a hunter-gatherer lifestyle.
I don’t not use Arch, by the way


Based and nixpilled


I agree, but I’m not sure it matters when it comes to the big questions, like “what separates us from the LLMs?” Answering that basically amounts to answering “what does it mean to be human?”, which has been stumping philosophers for millennia.
It’s true that artificial neurons are significantly different from biological ones, but are biological neurons what make us human? I’d argue no. Animals have neurons, so are they human? Also, if we ever did create a brain simulation that perfectly replicated someone’s brain down to the cellular level, and that simulation behaved exactly like the original, I would characterize that as a human.
It’s also true that LLMs can’t learn, but there are plenty of people with anterograde amnesia who can’t either.
This feels similar to the debates about what separates us from other animal species. It used to be thought that humans were qualitatively different than other species by virtue of our use of tools, language, and culture. Then it was discovered that plenty of other animals use tools, have language, and something resembling a culture. These discoveries were ridiculed by many throughout the 20th century, even by scientists, because they wanted to keep believing humans are special in some qualitative way. I see the same thing happening with LLMs.


I don’t know how I work. I couldn’t tell you much about neuroscience beyond “neurons are linked together and somehow that creates thoughts”. And even when it comes to complex thoughts, I sometimes can’t explain why. At my job, I often lean on intuition I’ve developed over a decade. I can look at a system and get an immediate sense if it’s going to work well, but actually explaining why or why not takes a lot more time and energy. Am I an LLM?


deleted by creator


Ph’nglui mglw’nafh Kevin Rose Digg wgah’nagl fhtagn
What about the M-expression version (f[x])?