• 1 Post
  • 207 Comments
Joined 4 years ago
Cake day: May 26th, 2021








  • Would self-hosting a Nextcloud instance locally without an internet connection be viable?

    Yes, that should be no problem. Some Nextcloud plugins might require a domain name and/or an HTTPS connection. If you can’t run your own DNS server, you can change the clients’ hosts files instead. IIRC the Docker and Snap versions of Nextcloud auto-generate self-signed SSL certificates, and you can put a reverse proxy in front if they don’t (e.g. Caddy, which generates certificates automatically); rough sketch below. You won’t be able to update regularly, so you should only let trusted users into your network.

    Idk about GitLab, but I’m sure you can run Gitea offline if you don’t need any of the fancy GitLab features. (It’s faster too, and you can set up Gitea to log in with the Nextcloud account.)
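    Rough sketch of the hosts-file + reverse proxy idea (hostname, IP and port are made up; iirc the --internal-certs flag makes Caddy use its local CA instead of trying to reach Let’s Encrypt):

    # On each client: point a made-up hostname at the server's LAN IP
    echo "192.168.1.10 nextcloud.home" | sudo tee -a /etc/hosts

    # On the server: Caddy terminates HTTPS and proxies to Nextcloud on port 8080,
    # signing the certificate with its own local CA (no internet needed);
    # clients will get a certificate warning unless they import that CA
    caddy reverse-proxy --from nextcloud.home --to localhost:8080 --internal-certs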


  • Websites usually use transport encryption, but the password itself is only protected by that: it’s sent as plaintext inside the TLS connection. There are authentication schemes that never send the plaintext password (they involve some kind of challenge; rough sketch below), but they won’t work without JavaScript (except HTTP digest access authentication, but that’s no good), and you shouldn’t ask web developers to implement them since they will find a way to fuck it up.
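    Toy example of the challenge idea (not any real scheme like SCRAM, and a real server would store a derived verifier instead of the raw password):

    # Server: generate a random one-time challenge and send it to the client
    CHALLENGE=$(openssl rand -hex 16)

    # Client: prove knowledge of the password by returning an HMAC over the
    # challenge instead of the password itself
    RESPONSE=$(printf '%s' "$CHALLENGE" | openssl dgst -sha256 -hmac "$PASSWORD" -r | cut -d' ' -f1)

    # Server: recompute the HMAC with the stored secret and compare;
    # the plaintext password never crosses the wire, and a captured response
    # can't be replayed because the challenge changes every time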




  • Germany closed down all their nuclear plants

    That’s wrong: 3 plants are still running (probably) until the end of this year.

    the end result was that they just started using more fossil fuels.

    This is true, but the reason isn’t a lack of alternatives, it’s incompetent and corrupt state and federal governments. They sabotaged the domestic solar sector, made running private (rooftop) solar installations unnecessarily complicated, made building new (onshore) wind farms basically impossible, and blocked the expansion of the electrical grid. (And that’s just the stuff I remember off the top of my head.)






  • find should already be installed, but depending on how the files are named, plain ls should do.

    You probably want to do something similar to this snippet:

    # Create a temporary directory and save its path
    TMPD=$(mktemp -d)
    
    # Extract the archive (first script argument) into that directory
    7z x -o"$TMPD" "$1"
    
    # Convert the images to a single PDF (second script argument);
    # ls -v sorts the pages in natural order, assuming no spaces in the names
    img2pdf $(ls -v "$TMPD"/*) -o "$2"
    
    # Delete the temporary directory and its contents
    rm -r "$TMPD"
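    If you save that as e.g. cbz2pdf.sh (the name is made up) and make it executable, you’d call it like this:

    ./cbz2pdf.sh MyComic.cbz MyComic.pdf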
    

  • doing this on Python or whatever language?

    If you’re on Linux or macOS, doing this with a bash script would be the easiest imo, since there’s ready-made software for all the steps.

    • Unpacking: unzip, unrar, 7z (the latter can handle many different formats)
    • Sorting the images: find
    • Converting them to pdf: img2pdf, imagemagick

    You can even automate the downloading with e.g. wget; there’s a rough end-to-end sketch below.

    On Windows you could (probably) do the same with a Python script (or maybe PowerShell, idk).
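    Very rough end-to-end sketch (the URL and file names are placeholders, and it assumes GNU find/sort/xargs plus image names that sort correctly):

    # Download the archive
    wget -O chapter.zip "https://example.com/chapter.zip"

    # Extract it into a temporary directory
    TMPD=$(mktemp -d)
    7z x -o"$TMPD" chapter.zip

    # Collect the images in natural sort order and build the PDF
    find "$TMPD" -type f \( -iname '*.jpg' -o -iname '*.png' \) | sort -V |
        xargs -d '\n' img2pdf -o chapter.pdf

    # Clean up
    rm -r "$TMPD"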

    Is an email requirement the default for Lemmy instances or is it something an operator has to choose?

    That can be configured.



  • It’s mostly no replies, deleted comments (why tf do people do that?!), people guessing (which isn’t that bad, but why don’t they disclose it?) or just pure bullshitting. (The latter I have no problem with, but maybe don’t do it in threads where someone needs help?!)

    There’s so much good content out there, but search engines seem to actively punish sites that skip the tracking, the bad usability, and the megabytes of useless JavaScript.