I don’t get why only four of these are jokes
- Push directly to master, not main
- No command line args, just change the global const and recompile
- No env vars either
- Port numbers only go up to 5280, the number of feet in a mile
- All auth is just a password; tokens are minority developers, not auth, and usernames are identity politics
- No hashes – it’s the gateway drug to fentanyl
- No imports. INTERNAL DEVELOPERS FIRST
- Exceptions are now illegal and therefore won’t occur, so no need to check for them
- SOAP/XML APIs only
- No support for external machines. If it’s good enough for my machine, it’s good enough for yours.
Exceptions are now illegal and therefore won’t occur, so no need to check for them
Ah, I see you’ve met C++ developers.
No command line args, just change the global const and recompile
Nah, don’t use global variables, magic values everywhere. And don’t use const whatsoever, we need to move fast and break things, we can’t let something immutable stop us
Main branches will be renamed Master
More like Grandwizard
Nope Main branches will be renamed Daddy
MAGA - Make Assembly Great Again
Error handling should only be with “if”
- Variable names must be generic and similar to each other
Debugging is only done with prints
Version numbers must be incoherent, hard to order correctly, contain letters and jump in ways that don’t align with the updates done.
Single letters or UTF-8 symbols only. Emojis are encouraged.
Variable names should be var{n} where n = 0, 1, 2…
Pff, just use the numbers directly:
${1} = "value"; ${2} = "DOGE";
That makes it possible to do stuff like:
for (${152} = 1; ${152} <= 2; ${152}++) { ${666} = $${152}; print(${666}); }
This is valid code, btw.
Something something the cruelty is the point.
reverting main back to master
Yeah…this one is sadly on brand
Sadly? Master branch never implied the existence of a slave branch. It was one of the dumbest pieces of woke incursion into tech.
It was kind of pointless, but at least it made software work with custom default branches.
Yes exactly. It’s a reference to the recording industry’s practice of calling the final version of an album the “master” which gets sent for duplication.
In alignment with this, we should not replace the master branch with the main branch, we should replace it with the gold branch.
Every time a PR gets approval and it’s time to merge, I could declare that the code has “gone gold” and I am not doing that right now!
Merged -> gone gold
Deployed -> gone platinum
Gone a week without crashing production -> triple platinum
But why even? There’s no risk to changing it and some risk to keeping it. That’s the reason for the push to change it. Keeping something just because it’s tradition isn’t a good idea outside ceremonies.
It’s the principle of letting uneducated people dictate what words are acceptable to us
letting uneducated people
More like overeducated people
overeducated people who can’t see that “master” has multiple meanings.
What makes you think they’re uneducated?
Yeah agreed. Just another piece of white devs acting like they knew better for everyone.
For this political correctness you get trunk.
Git default branch renamed back from main to master
(Someone else made it but I can’t find the source)
and all the others start with “slave/”
Merge me senpai
That one actually seems plausible, if he ever learns about that whole thing
Would be the most sane thing he’s ever done.
From this point on, all arrays are reverse-indexed.
♾️-0 ♾️-1 …
Hey now, you know that according to the Bible the biggest number is a million. Anything larger than that including infinity is some of that “woke shit”.
Your array will be 999,999, 999,998, 999,997 …
Halfway to Lua lol
What about stacks growing to higher addresses?
I’m unfamiliar with this as well. If you are allocating memory for a stack, why does it matter which direction it populates data? Is this just a convention?
I asked deepseek: “Downward-growing stacks are more common in many architectures (e.g., x86, ARM). This convention originated from early computer architectures and has been carried forward for consistency.”
Funny, I can’t remember, because I did a lot of assembler.
Ah, thank you, so it’s just a convention.
GTFOH with that. 1-indexed arrays?! You monster.
(Mostly joking… Ok, somewhat joking :P )
Lua has entered the chat
Lua had been banned from the chat
In Lua all arrays are just dictionaries with integer keys; a[0] will work just fine. It’s just that all built-in functions will expect arrays that start at index 1.
Your argument isn’t making me any happier - it just fills me with more rage.
That’s slightly misleading, I think. There are no arrays in Lua; every Lua data structure is a table (sometimes pretending to be something else), and you can have anything as a key as long as it’s not nil. There are also no integers; Lua only has a single number type, which is floating point. This is perfectly valid:
```lua
local tbl = {}
local f = function() error(":(") end

tbl[tbl] = tbl
tbl[f] = tbl
tbl["tbl"] = tbl

print(tbl)
-- table: 0x557a907f0f40

print(tbl[tbl], tbl[f], tbl["tbl"])
-- table: 0x557a907f0f40  table: 0x557a907f0f40  table: 0x557a907f0f40

for key, value in pairs(tbl) do print(key, "=", value) end
-- tbl = table: 0x557a907f0f40
-- function: 0x557a907edff0 = table: 0x557a907f0f40
-- table: 0x557a907f0f40 = table: 0x557a907f0f40

print(type(1), type(-0.5), type(math.pi), type(math.maxinteger))
-- number number number number
```
PHP did that same thing. It was a big problem when algorithmic complexity attacks were discovered. It took PHP years to integrate an effective solution that didn’t break everything.
Fortran angrily starts typing…
Don’t do my boy Lua dirty like that >:(
I always felt that Lua was a girl
Lua - Portuguese feminine noun for “moon”, coming from the Latin “luna”
Luna - Latin, feminine noun (coincidentally identical to the Italian noun, also feminine)
Yup, Lua is a girl.
and MATLAB, Visual Basic (with Option Base 1), and SQL.
Writing Lua code that also interacts with C code that uses 0-indexing is an awful experience. Annoys me to this day even though I haven’t used it for 2 years.
This is one of the few things that I really don’t like about Lua. It’s otherwise pretty decent and useful.
Visual Basic used to let you choose if you wanted to start arrays at 0 or 1. It was an app-wide setting, so that was fun.
I’ve not heard that name in a long time…
It’s how I got into programming, so I’ll always have a soft spot for it. Now it’s over 20 years later and I’m still coding.
Apple Basic (on an Apple IIe) was my first language that I recall.
Didn’t have a computer powerful enough for VB until later. It does have a special place in my nostalgia zone but has also led so many astray.
How are arrays starting at 1 still a controversial take? Arrays should start at 1 and offsets at 0.
Arrays are address offsets.
So what’s 0 do then? I’m okay with wacky indexes (I’ve used something with negative indexes for an end-index shorthand), but 0 has to mean something that’s actually useful. Using the index as the offset into the array seems to be the most useful way to index them.
He’s got to be in contact with the CEO of my company, this is trade secret theft if not…
Arrays not starting at 1 bother me. I think the entrenched 0-based index is more important than any major push to use 1 instead, but if I could go back in time and change it I would.
This is what messed me up with ZSH for a bit; having a shell default to 1 instead of 0 was weird
Rare zshell L
It really doesn’t make sense to start at 1, as the index is really the distance from the start, and changing it would screw up other parts of indexing and counters.
It would screw up existing code, but doing [array.length() - 1] is pretty stupid.
For i = 0; I < array.length; i++
Casually throws in capitals as well.
i < array.length
or else you overflow.
A lot of languages have a .last() or negative indexer ([-1]) to get the last item though.
It doesn’t make sense that the fourth element is element number 3 either.
Ultimately it’s just about you being used to it.
Yeah, but if we went back in time and changed it then there wouldn’t be other stuff relying on it being 0-based.
It was not randomly decided. Even before arrays as a language concept existed, you would just store objects in contiguous memory.
To access them you would do $addr+0, $addr+1, etc. The index had to be zero-based or you would simply waste the first address.
Then languages like C just added a little bit of syntactic sugar, where the ‘[]’ operator is shorthand for that offset. An array is still just a memory address (i.e. a pointer).
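A minimal C sketch of that equivalence, with a throwaway array just for illustration:

```c
#include <stdio.h>

int main(void) {
    int a[3] = {10, 20, 30};

    /* a[i] is defined as *(a + i): the index is just the offset from the base address. */
    printf("%d %d\n", a[0], *(a + 0)); /* 10 10 */
    printf("%d %d\n", a[2], *(a + 2)); /* 30 30 */

    /* The array name decays to a pointer to its first element, so index 0 is the base address itself. */
    printf("%p %p\n", (void *)a, (void *)&a[0]);

    /* Which is also why the cursed 2[a] compiles: it is just *(2 + a). */
    printf("%d\n", 2[a]); /* 30 */
    return 0;
}
```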
I know. But in the alternate reality where we’d been using 1-based indices forever, you’d be telling me how useful it is that the first element is “1” instead of zero, and I’d be saying there are some benefits to a zero-based index because it’s more like an offset than an index.
Also remove null reference
NGL, putting the decisions the monkey-in-charge is making into terms experts in a field will understand is a very good way to showcase the absurdity.
Are there really people capable of understanding this who aren’t capable of understanding, for example, “tariffs increase inflation”?
Haven’t heard of the stack address thing, anyone got a TLDR on the topic?
TL;DR: For historical reasons, stacks growing down is defined in hardware on some CPUs (notably x86). On other CPUs, like some ARM chips for example, you (or more likely your compiler’s developer) can technically choose which direction stacks grow, but not conforming to the historical standard is the choice of a madman.
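If you want to see which way your stack grows, here’s a quick and dirty C sketch; comparing addresses of unrelated locals isn’t strictly well-defined C and an optimizer can inline the call, so treat it as an illustration rather than a guarantee:

```c
#include <stdio.h>

/* Compare the address of a local in a nested call with one in the caller.
   If the callee's local sits at a lower address, the stack is growing
   downward. Not strictly portable, but it works as a demo on common
   compilers at -O0. */
static void callee(char *caller_local) {
    char callee_local;
    if (&callee_local < caller_local)
        puts("stack appears to grow downward (toward lower addresses)");
    else
        puts("stack appears to grow upward (toward higher addresses)");
}

int main(void) {
    char caller_local;
    callee(&caller_local);
    return 0;
}
```

On a typical x86-64 build it should print the “downward” line, matching the TL;DR above.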
Pretty sure it’s something along the lines of “stack begins high, grows down, while heap begins low, grows high”; when they meet, it’s a stack overflow
They don’t have to meet, the max stack size is defined at compile time
Dynamic stacks are pretty common in the most popular scripting languages, but considered bad practice by folks who use systems languages