  • I reckon it’s simpler than that. Zuckerberg has never really invented anything novel; Facebook was a straight clone of a whole bunch of competing social media sites (which just so happened to win the numbers war), and WhatsApp and Instagram were both acquisitions.

    I think the Metaverse was Zuckerberg trying to prove to himself and others that he personally could come up with the “next big thing”. The fact that he came up with something which absolutely no-one wanted (and most people barely understood) is also a testament to why he never came up with anything groundbreaking before.


  • Ads and monetization have ruined the internet compared to what it was. Early Internet was completely without ads, and things were run by people who were actually interested in the content presented, not in profits.

    How early are we talking here? If you mean pre-Web, in the Usenet era it was standard practice to pay a subscription to join a Usenet server. If you mean the early Web, ads were already everywhere by the mid-90s.


  • Yeah, I mean if you want to get picky, the actual i386 processor family hasn’t been supported by the Linux kernel since 2012, and was dropped by Debian in 2007.

    Most people weren’t particularly affected by that, seeing as the last i386 chip was released in (I think) 1989!

    Debian’s choice to refer to the whole x86-32 line as i386 has always been a weird historical quirk.



  • Projects which choose BSD/Apache-type licences do so in full knowledge that their code may be incorporated into projects with different licences. That’s literally the point: it’s considered a feature of the licence. These projects are explicitly OK with their code going proprietary, for example. If they weren’t OK with it, they’d use a GPL-type copyleft licence instead, since keeping derivatives under the same licence is, conversely, the literal point of those licences.

    Being mad about your Apache code being incorporated into a GPL project would make no sense, and certainly wouldn’t garner any sympathy from most people in the FOSS community.






  • The poor devs aren’t even saying “no”. They’re just saying “what the hell is going on and why didn’t you ask us about this first”.

    Pretty poor form for the OP to use a “KDE Developer”-badged account when they didn’t have any backing from the KDE developers to make the post. Makes it look a lot more official than it actually is.



  • I’ve seen a ton of posts bashing arch and commenters pretty much calling it a “good for nothing distro”, with the only more hated distro being Manjaro.

    All distros have their little hate-clubs. Try being an Ubuntu user! Or Debian (“why are all the packages so old!”), or Fedora (“ew, Red Hat”), or Gentoo (“is that a laptop or a space heater?”) or…er, openSUSE (now I come to think of it, does anybody actually hate SUSE?). You get the idea, anyway. People get super weird and fanboyish about distros.

    I don’t think Arch has it any worse than the rest.


  • I’ve been a Linux user for a decade and a half now, but still use Windows on my corporate laptops. Honestly, it’s baffling how Microsoft seem to consistently miss the mark with UI design. There’s lots to be said about the underlying internals of Windows vs Linux, performance, kernel design etc., but even at the shallow, end-user, “is this thing pleasant to use” stakes, they just never manage to get it right.

    Windows 7 was…fine. It was largely inoffensive from a shell point of view, although things about how config and settings were handled were still pretty screwy. But Windows 8 was an absolutely insane approach to UI design, Windows 10 spent an awful lot of energy just trying to de-awful it without throwing the whole thing out, and Windows 11 is missing basic UI features that even Windows 7 had.

    When you look at their main commercial competition (Mac and Chromebook) or the big names in Linux (GNOME, KDE, plenty of others besides), they stand out as a company that simply can’t get it right, despite having more resources to throw at it than the rest of them put together.


  • It’s not a “shitty title”, because Ubuntu Linux is the thing they actually tested.

    Whether Debian or Fedora or Alpine or Void or whatever would do better or worse is not a given, and isn’t something the OP can comment on because they didn’t test it.

    We can probably infer that gains of a similar amount would be seen on most mainstream distros (as they’re all pretty similar under the covers), but that’s not on the OP.

    In particular, Ubuntu ships with various non-free drivers and kernel patches that will be present in some, but not all, other distros.



  • ChromeOS is Linux, and it has pretty decent penetration.

    And I know what you’re going to say: “But ChromeOS isn’t proper Linux”. But it’s a desktop OS based on Gentoo, built on the Linux kernel, GNU coreutils and bash (although not GCC, as far as anyone can tell). It certainly has all the hallmarks of being GNU/Linux (or something very close to it).

    The fact that it doesn’t really resemble any “mainstream” Linux distro is kind of the point. It’s a locked-down corporate product with a minimalist front-end tied into a bunch of commercial web services, and that’s exactly the kind of device that sells in volume.

    Mainstream Linux is a tough sell. It was a tough sell 15 years ago when PCs were still the king of personal computing. In the post-smartphone, post-iPad world we’re in now, we have to accept that it’s never going to be the thing your grandma uses to check her email.

    Plenty of Linux distros aren’t just volunteer-based, and are instead made and supported by for-profit companies. Red Hat/Fedora is made by Big Blue itself, IBM; it doesn’t get much bigger than that. Ubuntu, SUSE, Manjaro: all for-profit commercial outfits. None of these are failures; it’s just that their products aren’t targeting the market for cheap consumer laptops. You can buy Ubuntu preloaded on a laptop from Dell or Lenovo, but they’re targeting IT professionals, data scientists and people who work with Linux servers. Or they’re targeting fleet deployments of hundreds of devices in municipal organisations. There’s a good market there, it’s just a different market.


  • Alright, I’ve just looked up both code repositories. You’re right, the first tagged version of snapd was committed one month before the first tagged version of Flatpak.
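
    For what it’s worth, here’s roughly how you can check this sort of thing yourself. It’s just a quick sketch, not anyone’s official tooling: the clone paths are made up, and it assumes you have git plus local checkouts of both repos.

    ```python
    # Rough sketch: find the date of the earliest tag in a git repo.
    # The repo paths below are hypothetical; point them at your own clones.
    import os
    import subprocess

    def first_tag_date(repo: str) -> str:
        repo = os.path.expanduser(repo)
        # List tags oldest-first by creation date and take the first one.
        tags = subprocess.run(
            ["git", "-C", repo, "tag", "--sort=creatordate"],
            capture_output=True, text=True, check=True,
        ).stdout.split()
        oldest = tags[0]
        # Commit date of that tag, in ISO 8601 format.
        date = subprocess.run(
            ["git", "-C", repo, "log", "-1", "--format=%aI", oldest],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        return f"{oldest} ({date})"

    for repo in ("~/src/snapd", "~/src/flatpak"):
        print(repo, "->", first_tag_date(repo))
    ```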

    Snap is actually quite a bit older than that: its original codebase was released as “click”, part of the Ubuntu Touch project, so it’s a project with a fairly long history.

    Flatpak’s roots come from OSTree, which has its own depth of history, but the idea of using it to create a containerised packaging format came after clicks and AppImages (and their forerunners) were already on the scene.

    Again, not a criticism of flatpaks. On the contrary, it shows that being the latecomer doesn’t mean you can’t be the winner.

    Unity was neither revolutionary (looked the same as Gnome), nor usable (it was slow af).

    Ubuntu had their own motivation for Unity: their focus at the time on full device convergence, i.e. a single DE across PCs, smartphones, smart TVs and kiosks. That wasn’t on the cards for GNOME, which made clear it was never going to be a design focus, and there really wasn’t (and still isn’t) any other DE built with that in mind.

    Of course, it didn’t work out. Partly because Canonical never had any success marketing Ubuntu Touch (on phones and tablets) or Ubuntu TV; partly because they were never able to get Unity to a place where it worked that way (the never-released Unity 8, now Lomiri, was due to be the big pay-off, but it was stuck in development hell). Canonical pulled the plug because they were haemorrhaging money on it and desperately needed to get back in the black.

    But honestly, there was as much legitimate reason for pursuing that as there was for any of the others. COSMIC being written in Rust isn’t revolutionary; Rust is great, but it’s just a memory-safe C-family language. It’s a fine choice to write a new DE in, but the benefits fall mostly on the developer’s side rather than the user’s.

    Well, what I meant was Mir as a display server, but you got the point. Now they turned it into a Wayland compositor. Cool, but then why not do a favor to the open source community and contribute to wlroots instead?

    Canonical’s main focus has been contributing to Mutter rather than Mir. Mir’s use case is really more kiosks, signage and thin-client devices (where it’s the guts of Ubuntu Frame); although it’s possible to build something like that on wlroots, nobody really has yet. And in any case, I take issue with:

    why not do a favor to the open source community and contribute to wlroots instead?

    Mir and Ubuntu Frame are open source, and since when have we required the FOSS world to be monolithic around one solution? We have multiple DEs, multiple browsers, multiple office suites and email clients, heck, whole selections of different FOSS OSes. The variety, competition and ability to choose is kinda the whole point. If Canonical think they can do a better job building Ubuntu Frame on Mir, they can have at it.




  • I’ve never understood why people run without swap. There’s basically no downside to having it. If you’re running a high-spec, high-RAM machine, you probably also have a big SSD/HDD and are very unlikely to be squeezing it to the last GB (and if you are, you should probably look into upgrading that). And if you’re on a machine with very limited SSD/HDD capacity, you’re probably not in an “ample RAM” situation anyway.

    Even on high-RAM systems, a few GB of swap can enable better caching and more graceful memory management. But heck, even if the thing sits there like an 8GB lump of nothing, were you really going to miss that last 8GB?
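
    If you want to see whether the swap on a given box is actually doing anything, here’s a minimal sketch using the third-party psutil library (not something your distro ships by default; you’d need to install it first):

    ```python
    # Minimal sketch: report RAM and swap usage via psutil
    # (third-party library; pip install psutil).
    import psutil

    vm = psutil.virtual_memory()
    sw = psutil.swap_memory()

    gib = 2**30
    print(f"RAM:  {vm.used / gib:.1f} / {vm.total / gib:.1f} GiB ({vm.percent}%)")
    print(f"Swap: {sw.used / gib:.1f} / {sw.total / gib:.1f} GiB ({sw.percent}%)")

    # sin/sout are cumulative bytes swapped in and out since boot: tiny numbers
    # mean the swap is sitting idle; big ones mean the system leans on it.
    print(f"Swapped in: {sw.sin / 2**20:.0f} MiB, out: {sw.sout / 2**20:.0f} MiB")
    ```

    If those sin/sout counters stay near zero, the swap really is just that idle lump, which is fine; it’s still there as a safety net for the day something leaks memory.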