After a long time I’m in a situation where I sometimes work on a temporary system without my individual setup. Now whenever I might add a new custom (nushell) command that abstracts the usage of CLI tools, I think about the loss of muscle memory/knowledge for these tools and how much time I waste looking them up without my individual setup. No, that’s not a huge amount of time, but just out of curiosity I’d like to know how I can minimize this problem as much as possible.
Do you have some tips and solutions to handle this dilemma? I try to shadow and wrap existing commands, whenever it’s possible, but that’s often not the case. Abbreviations in fish are optimal for this problem in some cases, but I don’t think going back to fish as my main shell for this single reason would be worth it.
I have a public git repository that I keep those kinds of recipes in.
So on a temporary system, I usually clone that repository first, so I can reuse past solutions.
Me, too, and it works for other Linux distros, but in this case it’s a Windows Sandbox. Unless it’s just copy and paste, it wouldn’t be worth it in this case, and I assume similar situations can come up in the future for other reasons.
I once started to work on auto-setup scripts for Windows, but the unpredictable nature of it made me give up on that :D
Yeah. This still sucks, but is getting substantially better every year. My lazy rule of thumb is that if I find a solution inside WMI (Windows Management Instrumentation), then I’ll script it. Otherwise, I figure I’m wasting my time, as it will change anyway.
If it’s Windows 10 or later, `winget` is preinstalled (sort of / mostly) and has access to a release of `git`. (WinGet is available on ‘Modern’ Windows 10 and later, and it may take a few minutes to bootstrap itself after first login.) So I’m able to bootstrap this pattern on Windows with something like:
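(The snippet itself didn’t survive the copy here; judging from the discussion below, the “longer version” from Stack Overflow is presumably along these lines, with `--id` and `-e` selecting an exact package ID and `--source winget` pinning the source:)

```shell
winget install --id Git.Git -e --source winget
```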
Syntax from Stack Overflow
I’m pretty sure I just use `winget install Git.Git`, but someone on SO recommends the above longer version. I’m guessing it prevents an interactive prompt, since there is more than one package source for `git`, if I recall.

You may be happily surprised - we don’t agree on much in technology, but bootstrapping with `git` is supported in places where nothing else works, and is finally even popular among Windows engineers.

I recall encountering two exceptional cases:
- Immutable distributions, like Batocera.
- Completely offline machines.

In both cases, I still version the relevant scripts in the same git repository, but I end up getting scrappy when deploying them.
On an immutable distribution, I’ll `curl`, `wget`, or `Invoke-WebRequest` to get a copy of each file I need, as I need it. I encounter this often enough that I find it worth putting copies into a public S3 bucket with a touch of nice DNS in front. It does wonders for me remembering the correct path to each file.

On a completely offline distribution, I run `git init --bare` in a folder at the root of a thumb drive or network share, then `git push` a shallow copy of my scripts repo to it, and `git clone` from it on the machine I’m working on. I also keep a plain file copy as well, in case I cannot get `git` bootstrapped on the offline machine.

I do still bother with the `git` version because I invariably need to make a tiny nuanced script correction, and it’s so much easier (for my work patterns) to sync it back later with `git`.