We really can vote with our dollars. The issue is that we don’t (I’m pointing right at myself here).
Don’t buy the things; we probably don’t need ’em.
SMB: https://en.m.wikipedia.org/wiki/Server_Message_Block
In short, it’s a way to share network access to storage across macOS/Linux/Windows.
macOS switched from AFP to SMB (as the default file sharing / network storage protocol) a few years ago as it was clear that was where everything was headed - though iOS and macOS also have native support for NFS.
On Linux, you can use Samba to create SMB shares that will be available to your iOS device.
It’s a lot of configuration though - so maybe not the best choice. A bare-bones example of the setup is below.
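Roughly, the minimal setup looks something like this - the share name, path, and username are placeholders, and the option names assume a reasonably recent Samba:

```
# /etc/samba/smb.conf -- bare-bones share (share name, path, and user are placeholders)
[global]
   server min protocol = SMB2   # iOS and modern macOS expect SMB2 or newer

[media]
   path = /srv/media
   read only = no
   valid users = alice

# Then give that user a Samba password and restart the service
# (the service name may differ by distro):
#   sudo smbpasswd -a alice
#   sudo systemctl restart smbd
```

From the iOS Files app you’d then use “Connect to Server” and point it at smb://<the machine’s hostname or IP>.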
As for Nextcloud - indeed you can use it in your local network without making it available on your WAN connection. That’s how we use it here.
When we need it remotely - we VPN into our home network. But no exposed ports. :)
I use Nextcloud. But that also means setting up and managing Nextcloud. By the same token you could use Google Drive.
For notes and photos you can export them within the app. Notes specifically requires that you choose Print and then hit Share in the print dialogue to save the note to the file system as a PDF.
Notes also has another option: if you have a non-Apple mail account on your phone - you can enable notes for that email account and simply move (or copy) your notes from one account to the other. The notes will then become available within that email account mailbox structure on any device or machine where that email account is enabled.
For voice recordings you can save any voice recording directly to the iOS filesystem.
The iOS files app also allows you to connect to any other server/desktop via SMB.
There are lots of options here. None are awesome, but they work.
While you’re technically right, I don’t see a material difference between paying with cash and paying with data (The Verge’s sign-up is free, but it’s still a sign-up).
I don’t think it will be that cut and dried.
A huge number of tech companies are still and/or will always be fully remote.
Over time, the big paychecks that Meta and Google and Apple are offering will be overshadowed by the possibilities of remote work done right (as opposed to simply working as you would in the office, but from home).
There are lots of smart, talented folks out there willing to take a pay cut to gain back the time that office culture can waste, starting with the commute.
Sure there are challenges to the sense of togetherness that can help build great teams, but plenty of remote-only organizations make the time and space to foster that appropriately.
Ultimately, I think we’ll find that the eventual competitors to the MAANG-like behemoths emerge out of smart, well designed, remote-first organizations. Though I think Netflix is largely remote - at least for the engineers I know who work there.
Grateful that they don’t. But they have tried to do it with podcasts.
Spotify “pulled an Apple”, bought Gimlet and moved all their podcasts onto Spotify exclusively. I don’t use Spotify and chose to find alternatives. I’m happy I did.
If you like sweet BBQ sauces, Blues Hog original is wonderful.
My family thinks I have a secret rib recipe and it’s just a thin coat of Blues Hog original near the end of the cook.
I only found the sauce because a local BBQ place was selling it and I thought I’d try something new.
The only way to ensure privacy is something like PGP. Encrypt before you send. Heck you could even encrypt before you put the contents into a message body.
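A rough sketch with GnuPG - the recipient address is just a placeholder, and this assumes you’ve already imported their public key:

```
# Encrypt for a recipient whose public key is in your keyring.
# --armor produces plain ASCII you can paste straight into a message body.
gpg --encrypt --armor --recipient friend@example.com message.txt
# -> writes message.txt.asc

# The recipient decrypts with their private key:
gpg --decrypt message.txt.asc
```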
With self-hosted email, the messages themselves aren’t encrypted at rest, and every hop along the way sees them in clear text even when the connections between hops use TLS.
Ultimately the right answer for you will hinge on what your definition and level of privacy is.
I second this.
It’s going to be hard. If the recruiter/TA specialist is good at their job, they’ll try to get you to give a “ballpark.” They’ll do anything to try to figure out the lowest offer they can make.
Do not give in.
Hold firm and ask what their offer is and go from there.
In one case their offer was double what I was expecting. It changed my life.
In another, their offer was just slightly under what I was expecting, and I got what I hoped for with little effort and only a single back-and-forth.
There is one exception here: if they really want you and you are ABSOLUTELY sure you’re out of their salary band for the position, you can wield your salary demands like a sword. I recently used my expected salary (which I knew the company wouldn’t match) to negotiate a 4-day work week at full-time pay, with an extra week of vacation tacked on for good measure. Win-win.
It’s smokeless. We had lots of mosquito-filled nights before the fan made its appearance.
This is real.
We set up a largish fan outside near our fire pit, attached to an inverter powered by a power tool battery.
It dramatically reduced the mosquitos. A few still make it through, but for the most part it solved the issue.
Snaps I get, but Ubuntu? Aside from an asinine application process to get hired at Canonical, they did a lot to push for a more straightforward Linux desktop experience. Their time has passed, but calling it a cancer is a bit too much for me, considering all the fantastic offshoots.
Context: I came to Ubuntu from Gentoo. Debian before that, and a brief flirtation with the fantastic hot mess that was Mandrake when I first discovered Linux.
Welcome to Enstone.
You’re conferring a level of agency where none exists.
It appears to “understand.” It appears to be “knowledgeable.”
But LLMs do neither of those things.
Take this note from an OpenAI dev:
It’s that these models have leveraged so much data that they’ve been able to map out relationships between words (or images) in such a way as to be able to generate what seem like new versions of those things.
I grant you that an LLM has more base-level knowledge than any one human, but again this is thanks to a terrifyingly large dataset and a design that means it can access this data reasonably reliably.
But it is still a prediction model. It just has more context, a better design, and (most importantly) more data, so it can make predictions at a level never before seen.
If you’ve ever had a chance to play with a model at a level where you can control some of its basic parameters, it offers a glimpse into just how much of a prediction machine it can be.
My favourite game for a while was to give Midjourney a wildly vague prompt but crank the chaos up to 100 (literally the chaos parameter at its highest value) to see what kind of wild connections exist but are being filtered out during “normal” use.
The same went for the GPT-3.5 API in the “early days” - you could return multiple versions of the response and see the sausage being made, to a very small degree.
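For what it’s worth, the same kind of experiment still works; here’s a minimal sketch using the current openai Python SDK (the model name, prompt, and parameter values are just illustrative):

```python
# Ask for several completions of one prompt and compare them side by side.
# Assumes the openai Python package (>= 1.0) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Describe a rainy street in one sentence."}],
    n=3,              # three independent completions of the same prompt
    temperature=1.2,  # higher temperature = wilder, less "filtered" output
)

for i, choice in enumerate(resp.choices):
    print(i, choice.message.content)
```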
It doesn’t take away from the sense of magic using these tools. It just helps frame what’s going on under the hood.