• 0 Posts
  • 448 Comments
Joined 5 years ago
Cake day: February 15th, 2021


  • SIM card is absolutely required even for emergency services

    For anyone wondering: while technically the cell towers might be able to accept emergency calls even without network authentication (which is what the SIM is for), there are countries/places that still require an active SIM, with the justification of preventing hoax calls.




  • The only reason for CSD is touch interfaces on small screens.

    Even in this case I’d argue that on small screens most apps simply have no real decorations (not even client-side)… there’s typically not even a close button. Hamburger buttons are menus, which isn’t what’s typically considered “decoration”. One could argue that the bar at the bottom in Android with home/back/etc. controls is effectively a form of SSD. Android offers system UI or gestures to send an app to the background (i.e. minimize) or close it; it does not require apps to render their own controls, which is effectively what Gnome is asking for with CSD.


  • They justify the rejection of SSD because it isn’t part of the core Wayland protocol, yet at the same time they push client apps to implement the “minimize” and “maximize” buttons (along with respecting some settings) despite that also not being part of the core protocol and only being possible through extensions. There are a ton of tiling compositors that don’t even have a concept of minimize/maximize, so why should this be required of every client app?

    It feels backwards to ask app developers to be the ones adding the UI for whatever features the window compositor might decide to have. They might as well be asking all app developers to add a “fullscreen” button to the decoration, or a “sticky” button, or a “roll up”/“shade” button like many old-school X11 WMs used to have. This would lead to apps lagging behind in what they have implemented support for, resulting in inconsistent UX, while also limiting the flexibility and user customization of the decorations, not just in terms of visuals but also function and behavior.
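    For reference, this kind of negotiation already exists as an extension: the xdg-decoration protocol lets a client state which decoration mode it prefers and lets the compositor pick the final mode. Here is an abridged excerpt of the interface definition (from the wayland-protocols xdg-decoration-unstable-v1 spec; summaries and the manager interface omitted for brevity):

```xml
<interface name="zxdg_toplevel_decoration_v1" version="1">
  <!-- client hints its preferred mode; the compositor has the final say -->
  <request name="set_mode">
    <arg name="mode" type="uint" enum="mode"/>
  </request>
  <!-- compositor tells the client which mode to actually use -->
  <event name="configure">
    <arg name="mode" type="uint" enum="mode"/>
  </event>
  <enum name="mode">
    <entry name="client_side" value="1"/>
    <entry name="server_side" value="2"/>
  </enum>
</interface>
```

    GNOME’s Mutter notably does not implement this extension, which is why clients running there fall back to drawing their own decorations.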


  • LLMs abstract information collected from the content through an algorithm (what they store is the result of a series of tests/analyses, not the content itself, but a set of characteristics/ideas). If that makes it derivative, then all abstractions are derivative. It’s not possible to make an abstraction without collecting data derived from a source you are observing.

    If derivative abstractions were already something that copyright can protect then litigants wouldn’t resort to patents, etc.
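    As a toy illustration of the difference between storing content and storing derived characteristics (this is obviously far simpler than what an LLM does, and the feature choice here is arbitrary): reduce a text to relative letter frequencies. The result is derived from the source, but the source cannot be reconstructed from it, and many different texts map to the same vector.

```python
from collections import Counter
import string

def abstract_features(text):
    """Reduce a text to a small set of derived characteristics
    (here: relative letter frequencies). The original content is
    not stored and cannot be recovered from the result."""
    letters = [c for c in text.lower() if c in string.ascii_lowercase]
    counts = Counter(letters)
    total = len(letters) or 1  # avoid division by zero on empty input
    return {c: counts[c] / total for c in string.ascii_lowercase}

# 26 numbers summarizing the text, not the text itself
features = abstract_features("The quick brown fox jumps over the lazy dog")
```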


  • You are not gonna protect abstract ideas using copyright. Essentially, what he’s proposing implies turning this “TGPL” into some sort of viral NDA, which is a different category of contract.

    It’s harder to convince someone that a content-focused license like the GPLv3 also protects abstract ideas than to create a new form of contract/license designed specifically to protect abstract ideas (not just the content itself) from being spread in ways you don’t want them to spread.





  • Ah, I see. Sorry, the text was too long and I’m not Dutch, so it was hard for me to spot that too.

    But I interpret that part differently. I think their saying that there’s an ambiguous section about risks does not necessarily mean the ambiguity concerns the responsibility of those who choose not to implement the detection… it could be the opposite: risks related to the detection mechanism itself, when a service has chosen to add it.

    I think we would need to actually see the text of the proposal to see where the vague expression she’s referring to is used.



  • Thanks for the link, and the clarification (I didn’t know about April 2026)… although it’s still confusing, to be honest. In your link they seem to allude to this just being a way to maintain a voluntary detection that is “already part of the current practice”…

    If that were the case, then at what point does “the new law force [chat providers] to have systems in place to catch or have data for law enforcement”? Will services like Signal, SimpleX, etc. really be forced to monitor the contents of the chats?

    I don’t find in the link any discussion of situations in which providers will be forced to do chat detection. My understanding from reading that transcript is that there’s no forced requirement on the providers to do this, or am I misunderstanding?

    Just for reference, below is the relevant section translated (emphasis mine).

    In what form does voluntary detection by providers take place, she asks. The exception to the e-Privacy Directive makes it possible for services to detect online sexual images and grooming on their services. The choice to do this lies with the providers of services themselves. They need to inform users in a clear, explicit and understandable way about the fact that they are doing this. This can be done, for example, through the general terms and conditions that must be accepted by the user. This is the current practice. Many platforms are already doing this and investing in improving detection techniques. For voluntary detection, think of Apple Child Safety — which is built into every iPhone by default — Instagram Teen Accounts and the protection settings for minors built into Snapchat and other large platforms. We want services to take responsibility themselves. That is an important starting point. According to the current proposal, this possibility would be made permanent.

    My impression from reading the Dutch is that they are opposing this because of the loss of the “periodic review” power that the EU would have if this voluntary detection is made permanent. So they aren’t worried about services like Signal/SimpleX, which wouldn’t do detection anyway, but about the services that might opt to actually do detection but do so without proper care for privacy/security… or that will use detection for purposes that don’t warrant it. At least that’s what I understand from the statement below:

    Nevertheless, the government sees an important risk in making this voluntary detection permanent. By making the voluntary detection permanent, the periodic review of the balance between the purpose of the detection and privacy and security considerations disappears. That is a concern for the cabinet. As a result, we as the Netherlands cannot fully support the proposal.



  • Where is this explained? The article might be wrong then, because it states the opposite:

    scanning is now “voluntary” for individual EU states to decide upon

    It makes it sound like it’s each state/country that decides, and that the reason “companies can still be pressured to scan chats to avoid heavy fines or being blocked in the EU” is that those countries would be forcing them.

    Who’s the one deciding what is needed to reduce “the risks of the chat app”? If it’s each country deciding this, then each country can opt to enforce chat scanning… so to me that means the former, not the latter.

    In fact, isn’t the latter already a thing? …I believe companies can already scan chats voluntarily, as long as they include this in their terms, and many do. A clear example is AI chats.




  • the local sending side has some way to control the state their particle wavefunctions collapse into (otherwise they’re just sending random noise).

    Do they? My impression is that, like the article says, “their states are random but always correlated”. I think they are in fact measuring random values on each side; it’s just that the results are correlated, following Schrödinger’s equation.

    I believe the intention is not “sending” specific data faster than light… but rather to “create Quantum Keys for secure information transmission”. The information between the quantum particles is correlated on both sides, so they can use this random data to generate matching keys on each side, which can then be used for secure encrypted communication (a “Quantum Network that will be used for secure communication and data exchange between Quantum Computers”), but the encrypted data itself wouldn’t travel faster than light.
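    The “random on each side, yet correlated” behavior can be shown with a toy classical simulation (this only mimics the correlation statistics of same-basis measurements on entangled pairs; it is not a model of the underlying physics, and real QKD protocols like BB84/E91 involve basis choices and eavesdropping checks that are omitted here):

```python
import secrets

def entangled_pair_measurements(n):
    """Toy model: each measurement yields a uniformly random bit, but the
    two sides' outcomes are perfectly correlated (same bit on both ends).
    No information is transmitted between the sides; the correlation was
    established when the pairs were created."""
    shared = [secrets.randbits(1) for _ in range(n)]
    alice = list(shared)  # Alice's outcomes: look like pure noise
    bob = list(shared)    # Bob's outcomes: also noise, yet identical
    return alice, bob

alice, bob = entangled_pair_measurements(256)
assert alice == bob  # perfectly correlated...
# ...so both ends can derive the same 32-byte key locally, with no
# key material ever traveling over a channel (faster than light or not)
key = bytes(
    sum(bit << i for i, bit in enumerate(alice[j:j + 8]))
    for j in range(0, len(alice), 8)
)
```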