Thoughts on Distributed Working

Apple’s march of releasing very efficient and fast computers built on their new M1 (Pro/Max) processors has long had me dreaming about how nice it would be to distribute my work across more computers.

Ever since the iPad Pro I’ve been interested in making my setup work primarily remotely. As I document in my garden, the iPad can be set up for development and sysadmin work. But it was still limited, because configuring the iPad to do the basic management needed to make it cost efficient1 was too complicated. With the release of the M1 MBA you have access to all your usual Unix tools (developed and maintained by others…) so you can easily launch, stop, and delete remote compute. You also have native access to every major browser; most importantly, Chrome.
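As a rough sketch of what that looks like in practice with the AWS CLI (the AMI ID, key name, instance type, and instance ID below are all placeholders, and the script defaults to printing the commands rather than running them):

```shell
#!/bin/sh
# Sketch: spinning a remote dev box up and down with the AWS CLI.
# The AMI ID, key name, instance type, and instance ID are placeholders.
# DRY_RUN defaults to 1, so commands are printed instead of executed.
DRY_RUN="${DRY_RUN:-1}"
run() {
  if [ "$DRY_RUN" = 1 ]; then echo "+ $*"; else "$@"; fi
}

# Launch a fresh instance to work on...
run aws ec2 run-instances \
  --image-id ami-0123456789abcdef0 \
  --instance-type t3.large \
  --key-name my-laptop-key

# ...and tear it down when you're done paying for it.
run aws ec2 terminate-instances --instance-ids i-0123456789abcdef0
```

Set `DRY_RUN=0` (with real IDs and credentials) to actually run the commands; the point is just that the whole lifecycle fits in a tiny script you can run from the laptop.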

I’m interested in this not just to be more mobile, but to remove as many concerns from my life as I can. I would rather not deal with the heat, power, and maintenance issues of powerful computers if I can avoid it. And if I did have a powerful computer, I would rather it be a desktop, so it can be as powerful (and upgradable) as possible while my portable computer stays as portable as possible.

The Allure of Apple

Apple’s devices are simply some of the best ever made. The arrival of the M1 chip has made this even more true for their laptops: many had predicted it, but seeing it in reality makes it even more incredible. I think most casual observers had heard “the iPad Pro chip is faster than laptops” without actually expecting these chips to BE BETTER, or to be TRULY that much faster. Yet the M1 chips are faster, without active cooling, and this change is on par with the advent of the iPhone. These laptops are now effectively devices in a class of their own, and it will take years before Intel/AMD/Qualcomm/etc. can mount a serious response, let alone before manufacturers design a comparable laptop.

That said, these are still new chips, and macOS has been making big changes between releases, which has introduced some growing pains. There have also been recent events where Apple, in an effort to further secure the platform, made it difficult for tools like Little Snitch to work correctly, and there are renewed concerns about the security and privacy of macOS in light of Apple’s changes.

It’s also worth mentioning that “smooth” is not how a lot of people have described new Macs for a while now. Between the lackluster Intel updates, limited RAM options, and, not least of all, years of horrible butterfly keyboards, people were left with an extremely bad taste in their mouths. These new computers do a lot to undo that, but Apple still has to work through it.

It was only a few years ago that the message among developers was “get a Mac or suffer.” It was essentially the only rational business decision: get the absolute best laptop, which also came with the best-supported environment.

With these new computers things seem to be taking a turn for the better, but there still seems to be good reason not to fully invest in Apple’s ecosystem in case things take a bad turn in the future.

Just someone else’s computer…

While Apple’s computers are amazingly powerful, the compute/networking/storage of cloud providers is also extremely cost competitive, and requires very little money up front. How often are you really using that 12900K + 3080 GPU? How often are you actually pushing that MacBook Pro to the max? If it’s often, then yes, it obviously makes sense to buy one (they’re still awesome computers), but if the work could easily be done on a remote computer, why not?

Our computers often fill up over time with crud we do not want or need. Some people even wipe and recreate their machines on a regular basis (or delete everything on every reboot). It would be nice to have a clean slate to work with on a regular basis, which you can easily get from a cloud provider, while keeping your local machine focused on whatever makes you productive.

So, paired with a powerful Mac from which we can run scripts to manage things, we have a pretty robust way to get access to the compute we need to get our work done.

I am glad that these ideas are not particularly original: GitHub just made its Codespaces feature generally available, which does for you a lot of the work I had trouble configuring on my iPad. It’s also quite cost competitive and solves a lot of the setup issues I’d hoped NixOS would help me with.
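With GitHub’s `gh` CLI the whole Codespaces lifecycle is a few commands (the repository name here is a placeholder):

```shell
# Sketch of the Codespaces flow with the `gh` CLI; `you/your-project` is a placeholder.
$ gh codespace create --repo you/your-project   # spin up a fresh, disposable dev box
$ gh codespace ssh                              # work in it from your terminal
$ gh codespace delete                           # throw it away when you're done
```

Which is essentially the launch/stop/delete workflow from above, with the environment setup handled for you.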

This is a holy grail for many projects: it simplifies things and lets us use cheaper, weaker devices when we want.

But in the end it doesn’t need to be a service or a server in the cloud. You could always just build a powerful PC, set up Linux on it, and install Tailscale to use it easily at home and on the go.
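A minimal sketch of that setup, assuming a Debian/Ubuntu-style machine and Tailscale’s standard install script (the hostname `desktop` is whatever your machine is named on your tailnet):

```shell
# Sketch: turning a home Linux box into your personal "cloud".
$ curl -fsSL https://tailscale.com/install.sh | sh   # install Tailscale
$ sudo tailscale up --ssh                            # join your tailnet, enable Tailscale SSH
# Then, from the laptop, anywhere with internet:
$ ssh desktop                                        # MagicDNS resolves the tailnet name
```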

Modern Thin Clients

And basically that’s what I am doing. I have bought an M1 MBA, which I love and which does 80–90% of what I need. I am also building a powerful desktop with an Intel 12900K, 64GB of DDR5 memory2, and a 3080 Ti to handle all my heavy-duty needs.

Most things probably won’t even require that much, and I’ll handle those either on the MBA itself or on a server running at some cloud provider. I’ve really enjoyed using Codespaces so far, so I think that’ll be the first one I reach for in solving things.

  1. Or at least it felt that way. Outside of creating a dedicated app for these things, you have Scriptable and Pythonista for running local code to talk to AWS/etc., but you’d need to maintain it. Working Copy and ShellFish both support launching and managing DigitalOcean servers, but I needed/wanted AWS. Another option was carrying a specially configured Raspberry Pi to run the scripts/programs needed (or to do whatever you needed the remote computer for). You could also connect to something low-power via Tailscale first to run the commands or apply changes via Terraform, then connect to the new instance(s). Another idea I had was to use GitHub PRs and CI/CD tools to manage my personal infrastructure, since Working Copy and the GitHub apps work so well, but it felt like too much work to get an instance launched quickly and I didn’t want that as my daily workflow. I have also thought about how I might use Route 53 and Lambdas to do something similar to this on-demand Minecraft setup, but again it felt so complicated that I never got around to it. I had hoped NixOS would help automate the setup of the remote servers so I could quickly launch new instances, but it has taken longer to learn and set up than I’d like. ↩︎

  2. If the memory would just ship already! Darn shortages. ↩︎