Scottish: got the painters in.
Some things cross language boundaries.
The CentOS “eight-pointed star”?


Menu bar at the top at least makes some sense - it’s easier to mouse to it, since you can’t go too far. Having menus per-window like Linux, or like Windows used to before big ugly ribbons became the thing, is easier to overshoot. (Which is why I always open my menu bars by pressing ‘alt’ with my left thumb, and then using the keyboard shortcuts that are helpfully underlined. Windows likes to hide those from you now since they’re ‘ugly’, and also makes you mouse over the pretty icons to get the tooltip that tells you what they are, which is just a PITA. Pretty != usable.)
Mac OS has had the menu at the top since before it was a multitasking OS. They had them there on the first Mac I ever used, a Macintosh Classic II back in 1991 or so, and it was probably like that before then too. It’s not like they’ve been ‘innovating’ that particular feature and annoying their users.
Generally, companies are trying to maximise profit, which means that the price will be reduced only when it’s stopped selling at the previous price and they want to make sales in the next, more price-conscious, segment of the market. They might want some quick bucks if the company is in financial trouble, or to ‘make the news’ with a sale if they need some publicity.
BG3 sold shedloads, is still selling shedloads, was on multiple games-of-the-year lists and generally ranks amongst the best games of all time, often at the top; and Larian seem sufficiently flush with cash from the success of it. So like you say, don’t hold your breath waiting for a big sale; it doesn’t make sense for them to do that.


Data centre GPUs tend not to have video outputs, and have power (and active cooling!) requirements in the “several kW” range. You might be able to snag one for work, if you work at a university or at somewhere that does a lot of 3D rendering - I’m thinking someone like Pixar. They are not the most convenient or useful things for a home build.
When the bubble bursts, they will mostly be used for creating a small mountain of e-waste, since the infrastructure to even switch them on costs more than the value they could ever bring.


There’s times when I want to find “exact matches and nothing but” - searching for error messages, for instance - and that’s made much harder than it should be by AI bullshit search engines that don’t want you to switch off their “helpful” features. Considering moving to Kagi instead.


Zelda 3? You get fast travel quite early and the world is packed with stuff, it’s not absurdly huge. Doesn’t have that bloody owl in it either, telling you the obvious at great length.
Certainly not Wind Waker, anyway. Now there is a slow game.


“Mostly perfectly unless they’ve got anti-cheat, and you’ll be limited to 30 fps for most of the fancy-graphics titles.”
Actually pretty damn good. Considering the difficulty I had getting frames out of E33 on desktop, having it play reasonably on the go is impressive. 60 fps @ 4K made my PC sound like a vacuum cleaner and was warming up the whole house; really needs some of the upscaling trickery to be comfortable to play.


Best story, for sure. Most emotionally affecting is Majora’s, for me, but TP is close.
Don’t think the gameplay holds up. The Wii version is pure waggle, but even on the Gamecube, there’s a lot of filler - empty space and backtracking. Doesn’t respect your gaming time.


HDMI -> DP might be viable, since DP is ‘simpler’.
Supporting HDMI means supporting a whole pile of bullshit, however - lots of handshakes. The ‘HDMI splitters’ that you can get on e.g. Alibaba (which also defeat HDCP) are active, powered things, and tend to get a bit expensive for high resolution / refresh.
The Steam Machine has already been closely inspected on price. Adding a fifty-dollar dongle to the package is probably out of the question, especially a ‘spec non-compliant’ one.


I’m going to guess it would require kernel support, but certainly graphics card driver support. AMD and Intel: not so difficult, just patch and recompile; NVIDIA’s binary blob? Ha, fat chance. Stick it in a repo somewhere outside of the zone of copyright control, add it to your package manager, boom, done.
I bet it’s not even much code. A struct or two that map the contents of the 2.1 handshake, and an extension to a switch statement that says what to do if it comes down the wire.
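For flavour, here’s a toy sketch of that “struct or two plus a switch” shape in Python - the field layout and opcodes are invented purely for illustration, and are not the real HDMI 2.1 wire format:

```python
import struct

# Hypothetical 4-byte handshake header: version, opcode, payload length.
# Invented layout for illustration only -- NOT the actual HDMI 2.1 spec.
HANDSHAKE_FMT = "<BBH"  # little-endian: u8, u8, u16

def handle_handshake(packet: bytes) -> str:
    version, opcode, length = struct.unpack_from(HANDSHAKE_FMT, packet, 0)
    # The "extension to a switch statement": dispatch on what came down the wire.
    if opcode == 0x01:
        return f"link training request, v{version}, {length} payload bytes"
    if opcode == 0x02:
        return f"capability query, v{version}"
    return "unknown opcode - fall back to pre-2.1 behaviour"

# Fabricated packet: version 2, opcode 1, 16 payload bytes to follow.
print(handle_handshake(struct.pack("<BBH", 2, 1, 16)))
# → link training request, v2, 16 payload bytes
```

A real driver patch would be C, of course, and considerably hairier around timing and link training, but the shape of the change is about this.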


Python tkinter interfaces might be inefficient, slow and require labyrinthine code to set up and use, but they make up for it by being breathtakingly ugly.
Java’s biggest strength is that “the worst it can be” is not all that bad, and refactoring tools are quite powerful. Yes, it’s wordy and long-winded. Fine, I’d rather work with that than other people’s Bash scripts, say. And just because a lot of Java developers have no concept of what memory allocation means, and are happy to pull in hundreds of megabytes of dependencies to do something trivial, then allocate fucking shitloads of RAM for no reason doesn’t mean that you have to.
There is a difference in microservices between those set up by a sane architect:
… and the CV-driven development kind by people who want to be able to ‘tick the boxes’ for their next career move:
We mostly do the second kind at my work; a nice Java monolith is bliss to work on in comparison. I can see why others would have bad things to say about them too.
Apart from being slow, having discoverability issues, not being able to combine filters and actions so that you frequently need to fall back to shell scripts for basic functionality, it being a complete PITA to compare things between accounts / regions, advanced functionality requiring you to directly edit JSON files, things randomly failing and the error message being carefully hidden away, the poor audit trail functionality to see who-changed-what, and the fact that putting anything complex together means spinning so many plates that Terraform’ing all your infrastructure looks like the easy way; I’ll have you know there’s nothing wrong with the AWS Console UI.
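The “fall back to shell scripts” bit in practice: comparing the same resource across regions ends up as a throwaway script like the sketch below. The fetch is faked with canned data here - in real life it’d be a couple of boto3 describe_* calls per region:

```python
# Throwaway-script sketch: diff the "same" resource across two regions.
# fetch_security_group() is faked with canned data standing in for what
# ec2.describe_security_groups() would return in each region.

def fetch_security_group(region: str) -> dict:
    data = {
        "eu-west-1": {"ports": [22, 443], "cidr": "10.0.0.0/16"},
        "us-east-1": {"ports": [22, 443, 8080], "cidr": "10.0.0.0/16"},
    }
    return data[region]

def diff_regions(a: str, b: str) -> dict:
    """Return only the keys whose values differ between the two regions."""
    left, right = fetch_security_group(a), fetch_security_group(b)
    return {k: (left[k], right[k]) for k in left if left[k] != right[k]}

print(diff_regions("eu-west-1", "us-east-1"))
# → {'ports': ([22, 443], [22, 443, 8080])}
```

Ten lines of Python to do what the Console makes you do with two browser tabs and a good memory.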


True. Was thinking of indie games, of the kind I might develop myself, which would be limited to the languages I speak.
If you’re developing something where you’d expect enough international sales to hire a translation team, then Chinese would be a sensible first choice, followed by Spanish.


Closing in on 8% if you filter it by “English language only”. Chinese speakers overwhelmingly (almost exclusively) use Windows and make up about 30% of all Steam users, which skews the rest-of-world results. And I wouldn’t consider 8% of all prospective sales to be a joke, especially since that number only keeps on rising and by the time you’ve spent a few years writing a game it’s likely to be quite a bit more.
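For what it’s worth, the skew arithmetic is simple: if a cohort that uses essentially no Linux makes up 30% of users, the Linux share among everyone else is the headline figure divided by the remaining fraction. With a hypothetical 5.5% headline share (illustrative, not actual survey data):

```python
# Back-of-envelope for how a large all-Windows cohort skews the headline
# number. Figures are illustrative, not the actual Steam survey data.
chinese_share = 0.30    # fraction of all Steam users (per the comment above)
linux_overall = 0.055   # hypothetical headline Linux share

# If that cohort is ~100% Windows, the Linux share among everyone else is
# the headline figure divided by the remaining fraction of users:
linux_rest_of_world = linux_overall / (1 - chinese_share)
print(f"{linux_rest_of_world:.1%}")  # → 7.9%
```

Which is how a mid-5s headline number turns into “closing in on 8%” once the filter is applied.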


Sorry, putting the two things together, my mistake. My router doesn’t let you specify the DNS server directly, but it does allow you to specify a different DHCP server, which can then hand out new leases with a different DNS server specified, as you say. Bit of a house of cards: running a DHCP server in order to be the DNS server too.
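As a sketch of that arrangement, a dnsmasq box on the LAN can take over DHCP and advertise itself as the DNS server via DHCP option 6 (the addresses here are made up, adjust to taste):

```
# /etc/dnsmasq.conf -- illustrative values only
dhcp-range=192.168.1.100,192.168.1.200,12h   # take over DHCP leases
dhcp-option=6,192.168.1.2                    # option 6: DNS server = this box
server=1.1.1.1                               # upstream resolver for cache misses
```

The router’s own DHCP has to be pointed away (or disabled) for this to work, hence the house of cards.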


The router provided with our internet contract doesn’t allow you to run your own firmware, so we don’t have anything so flexible as what OpenWRT would provide.
Short answer: in order to Pi-hole all of the advertising servers that we’d be connecting to otherwise. Our mobile phones don’t normally allow us to choose a DNS server, but they will use the network-provided one, so it sorts things out for the whole house in one go.
Long, UK answer: because our internet is being messed with by the government at the moment, and I’d prefer to be confident that the DNS look-ups we receive haven’t been altered. That doesn’t fix everything - it’s a VPN job - but little steps.
The DNS server provided with the router is also very slow in comparison to running our own locally. Lookups for websites we use often are cached, but connecting to something new takes several seconds. Nothing as infuriating as slow internet.
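A toy model of why the local cache feels faster - the “upstream” here is a fake resolver with a built-in delay, so no real DNS is involved:

```python
import functools
import time

def slow_upstream_lookup(name: str) -> str:
    """Stand-in for a sluggish ISP resolver: every call pays a round trip."""
    time.sleep(0.1)          # fake network latency
    return "192.0.2.1"       # documentation address standing in for an answer

@functools.lru_cache(maxsize=1024)
def resolve(name: str) -> str:
    """Caching front-end: first lookup is slow, repeats come from the cache."""
    return slow_upstream_lookup(name)

t0 = time.monotonic(); resolve("example.com"); first = time.monotonic() - t0
t0 = time.monotonic(); resolve("example.com"); second = time.monotonic() - t0
print(f"first: {first:.3f}s, cached: {second:.6f}s")
```

A real resolver like dnsmasq also honours TTLs rather than caching forever, but the felt difference - one slow round trip, then instant repeats - is the same.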


Big shout out to Windows 11 and their TPM bullshit.
Was thinking that my wee “Raspberry Pi home server” was starting to feel the load a bit too much, and wanted a bit of an upgrade. Local business was throwing out some cute little mini PCs since they couldn’t run Win11. Slap in a spare 16 GB memory module and a much better SSD that I had lying about, and it runs Arch (btw) like an absolute beast. Runs Forgejo, Postgres, DHCP, torrent and file server, active mobile phone backup etc. while sipping 4W of power. Perfect; much better fit than an old desktop keeping the house warm.
Have to think that if you’ve been given a work desktop machine with a ten-year old laptop CPU and 4GB of RAM to run Win10 on, then you’re probably not the most valued person at the company. Ran Ubuntu / GNOME just fine when I checked it at its original specs, tho. Shocking, the amount of e-waste that Microsoft is creating.
The problem is that the volume of slop available completely overwhelms all efforts at quality control. Zealotry only goes so far at turning back the tsunami of shite.