This allows me to rapidly iterate on shell pipelines. The main goal is to minimize my development latency, but it also has positive effects on dependencies (avoiding redundant RPC calls). The classic way of doing this is storing something in temporary files:
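For illustration, that temp-file pattern typically looks something like this (a generic sketch; the URL and the jq/awk steps are placeholders, not the original commands):

curl -s https://api.example.com/data > /tmp/data.json    # run the expensive part once, by hand
jq '.items[]' < /tmp/data.json | awk '{print $1}'        # then iterate on the cheap part
# ...whereas memo-style memoization keeps the whole pipeline on one editable line.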
up(1) looks really cool, I think I'll add it to my toolbox.
It looks like up(1) and memo(1) have similar use cases (or goals). I'll give it a try to see if I can appreciate its ergonomics. I suspect memo(1) will remain my mainstay:
1. After executing a pipeline, I like to press the up arrow (heh) and edit. Surprisingly often I need to edit something that's *not* the last part, but somewhere in the middle. I find this cumbersome in default line editing mode, so I will often drop into my editor (^X^E) to edit the command.
2. Up seems to create a shell script after completion. Avoiding the creation of extra files was one of my goals for memo(1). I'm sure some smart zsh/bash integration could be made that just returns the completed command instead.
Another thing I built into memo(1) which I forgot to mention: automatic compression. memo(1) will use available (de)compressors (in order of preference: zstd, lz4, xz, gzip) to (de)compress stored contents. It's surprising how much disk space and IOPS can be saved this way due to redundancy.
I currently only have two memoized commands:
$ for f in /tmp/memo/aktau/* ; do
ls -lh "$f" =(zstd -d < $f)
done
-rw-r----- 1 aktau aktau 33K /tmp/memo/aktau/0742a9d8a34c37c0b5659f7a876833b6dad9ec689f8f5c6065d05f8a27d993c7bbcbfdc3a7337c3dba17886d6f6002e95a434e4629.zst
-rw------- 1 aktau aktau 335K /tmp/zshSQRwR9
-rw-r----- 1 aktau aktau 827 /tmp/memo/aktau/8373b3af893222f928447acd410779182882087c6f4e7a19605f5308174f523f8b3feecbc14e1295447f45b49d3f06da5da7e8d7a6.zst
-rw------- 1 aktau aktau 7.4K /tmp/zshlpMMdo
The default storage location for memo(1) output is /tmp/memo/${USER}. Most distributions have some automatic periodic cleanup of /tmp, wipe it on restart, or both.
Separately from that:
- The invocation contains *memo* right in there, so you (the user) know that it might memoize.
- One uses memo(1) for commands that are generally slow. Rerunning your command that has a slow part and having it return in a millisecond while you weren't expecting it should make the spider-sense tingle.
In practice, this has never been a problem for me, and I've used this hacked together command for years.
I have used the Warp terminal for a couple of years, and recently they embedded AI into it. At first I was irritated and disabled it, but the AI Agent is built in as an optional mode (Cmd-I to toggle). And I found myself using it more and more often for commands that I have no capacity or will to remember or dig through the man pages for (from "figure out my IP address on the wifi interface" to "make ffmpeg do this or that"). It's fast and can iterate on its own errors, and now I can't resist using it regularly. Removes the need for "tools to memorize commands" entirely.
I see no way to name the memo in your examples, so how do you refer to them later?
also, this seems a lot like an automated way to write shell scripts that you can pipe to and from. so why not use a shell script that won't surprise anyone instead of this, which might?
> trash a.txt b.png moves `a.txt` and `b.png` to the trash. Supports macOS and Linux.
The way you’re doing it trashes files sequentially, meaning you hear the trashing sound once per file and ⌘Z in the Finder will only restore the last one. You can improve that (I did it for years) but consider just using the `trash` command, which ships with macOS. Doesn’t use the Finder, so no sound and no ⌘Z, but it’s fast, official, and still allows “Put Back”.
> jsonformat takes JSON at stdin and pretty-prints it to stdout.
Why prioritise node instead of jq? The latter is considerably less code and even comes preinstalled with macOS, now.
> uuid prints a v4 UUID. I use this about once a month.
Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?
> The best part about sharing your config or knowledge is that someone will always light up your blind spots.
Yes! I will take this as a chance to thank everyone who has shared their knowledge on the Internet. You guys are so freaking awesome! You are always appreciated.
A big chunk of my whole life's learning came from all the forums that I used to scour through, hour after hour! Because these awesome people were always sharing their knowledge, and someone was always adding more. That's what made the Internet the Internet. And all of it is now almost on the brink of being lost, because of greedy corporations.
This habit also helped me with doom-scrolling. I sometimes do doomscroll, but I can catch it quickly and snap out of it. Because, my whole life, I have always jumped into rabbit holes and actually read those big blog posts, the ones where you have those `A-ha` moments: "Oohh, I can use that", "Ahh, that's clever!".
When browsing doesn't give me that, my brain actually triggers: "What are you doing?"
Later, I got lazy, which I am still paying for. But I am going to get out of it.
Never stop jumping into those rabbit holes!! Well, obviously, not every rabbit hole is a good one, but you'll probably come out wiser.
That seems to be especially true on HN. Other forums have some of that as well, but on HN it seems nearly every single comment section is like 75% (random number) pointing out faults in the posted article.
Although I normally loathe pedantic assholes, I've found the ones on HN seem to be more tolerable because they typically know they'll have to back up what they're saying with facts (and ideally citations).
I've found that pedantic conversations here seem to actually have a greater potential for me to learn something from them than other forums/social platforms. On other platforms, I see someone providing a pedantic response and I'll just keep moving on, but on HN, I get curious to not only see who wins the nerd fight, but also that I might learn at least one thing along the way. I like that it's had an effect on how I engage with comment sections.
I have showdead on, and almost every single flagged post I've seen definitely deserves it. Every time it wasn't "deserved", the person simply took an overly aggressive tone for no real reason.
In short, I've never seen somebody flagged simply for having the wrong opinion. Even controversial opinions tend to stay unflagged, unless they're incredibly dangerous or unhinged.
I've seen a few dead posts where there was an innocent misunderstanding or wrong assumption. In those cases it would have been beneficial to keep the post visible and post a response, so that readers with similarly mistaken assumptions could have seen a correction. That's a small minority of dead posts, though. They can actually be vouched for, but of course this is unlikely to happen.
I agree that most dead posts would be a distraction and that it's good they were kept out.
It’s a blunt tool, but quite useful for posts. I read most dead posts I come across and I don’t think I ever saw one that was not obviously in violation of several guidelines.
OTOH I don’t like flagging stories because good ones get buried regularly. But then HN is not a great place for peaceful, nuanced discussion and these threads often descend into mindless flame wars, which would bury the stories even without flagging.
So, meh. I think flagging is a moderately good thing overall but it really lacks in subtlety.
> I've found the ones on HN seem to be more tolerable because they typically know they'll have to back up what they're saying with facts (and ideally citations).
Can you back this up with data? ;-)
Around here I see citations and links to sources about as rarely as on reddit.
The difference I see is in the top 1% of comments, which exist here in the first place and are better on average (though that depends on what other forums or subreddits you compare it to; /r/AskHistorians is pretty good for serious history answers, for example), but not in the rest of the comments. Also, there are fewer distractions, more staying on topic, and joke replies are punished more often and are less frequent.
True true, one of my favorite things is watching the shorts on home improvement or 'hacks', and sure enough there are always multiple comments saying why it won't work and why it's not the right way. Just as entertaining as the video.
Also possible (even though I've seen the author's response about not knowing) is that the scripts were written before the native version was included. At that point, the muscle memory is just there. I know I have a few scripts like that myself.
> Why prioritise node instead of jq? The latter is considerably less code and even comes preinstalled with macOS, now.
That was my thought. I use jq to pretty print json.
What I have found useful is j2p and p2j, which convert between Python dict format and JSON format (and pretty-print the output). I also have j2p_clip and p2j_clip, which read from and then write to the system clipboard so I don't have to manually pipe in and out.
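A rough sketch of those conversion helpers (hypothetical one-liners using the Python standard library, not the commenter's actual scripts):

p2j() { python3 -c 'import ast,json,sys; print(json.dumps(ast.literal_eval(sys.stdin.read()), indent=2))'; }
j2p() { python3 -c 'import json,sys,pprint; pprint.pprint(json.load(sys.stdin))'; }
# Clipboard variants on macOS could simply wrap these, e.g. pbpaste | p2j | pbcopy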
> Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?
I also made a uuid, which just runs uuidgen, but then trims the \n. (And maybe copied to clipboard? It was at my old job, and I don't seem to have saved it to my personal computer.)
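It was probably something as small as this (a guess at the lost script, macOS flavoured):

uuidgen | tr -d '\n' | pbcopy   # UUID without the trailing newline, straight onto the clipboard (Linux: pipe to xclip -selection clipboard instead)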
Other examples where native features are better than these self-made scripts...
> vim [...] I select a region and then run :'<,'>!markdownquote
Just select the first column with ctrl-v, then "I> " then escape. That's 4 keys after the selection, instead of 20.
> u+ 2025 returns ñ, LATIN SMALL LETTER N WITH TILDE
`unicode` is widely available, has a good default search, and many options.
BTW, I wonder why "2025" matched "ñ".
unicode ñ
U+00F1 LATIN SMALL LETTER N WITH TILDE
UTF-8: c3 b1 UTF-16BE: 00f1 Decimal: ñ Octal: \0361
> catbin foo is basically cat "$(which foo)"
Since the author is using zsh, `cat =foo` is shorter and more powerful. It's also much less error-prone with long commands, since zsh can smartly complete after =.
I use it often, e.g. `file =firefox` or `vim =myscript.sh`.
`trash` is good to know, thanks! I'd been doing: "tell app \"Finder\" to move {%s} to trash" where %s is a comma separated list of "the POSIX file <path-to-file>".
I believe it would be possible to execute an applescript to tell the finder to delete the files in one go. It would theoretically be possible to construct/run the applescript directly in a shell script. It would be easier (but still not trivial) to write an applescript file to take the file list as an argument to then delete when calling from the shell.
It’s not theoretical, and it is trivial. Like I said, I did exactly that for years. Specifically, I had a function in my `.zshrc` to expand all inputs to their full paths, verify and exclude invalid arguments, trash the rest in one swoop, then show me an error with the invalid arguments, if any.
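A rough zsh sketch of that kind of batch-trash function (hypothetical, not the commenter's actual code; paths containing double quotes would need extra escaping):

trashall() {
  local -a valid invalid
  local f
  for f in "$@"; do
    if [[ -e $f ]]; then valid+=("${f:A}"); else invalid+=("$f"); fi
  done
  if (( ${#valid} )); then
    local list=$(printf 'POSIX file "%s", ' "${valid[@]}")
    osascript -e "tell app \"Finder\" to move {${list%, }} to trash" >/dev/null
  fi
  (( ${#invalid} )) && print -u2 "not found: ${invalid[*]}"
}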
Instead of trash, reimplementing rm (to only really delete after some time or depending on resource usage, or to shred if you are paranoid and the goal is to really delete something) or using zfs makes much more sense.
> Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.
Instead of being rude to a fellow human making an inoffensive remark, you could’ve spent your words being kind and describing the scenario you claim exists. For all you know, maybe they did ask ChatGPT and were unconvinced by the answer.
As a side note, I don’t even understand how your swipe would make sense. If anything, needing ChatGPT is what demonstrates a lack of imagination (having the latter you don’t need the former).
The trash command for macOS that's being talked about above is native in the OS now, since v14 according to its manpage, though I see it may have really been v15[1]
I've written on this before, but I have an extensive collection of "at" scripts. This started 25+ years ago when I dragged a PC tower running BSD to a friend's house, and their network differed from mine. So I wrote an @friend script which did a bunch of ifconfig foo.
Over time that's grown to an @foo script for every project I work on, every place I frequent that has some kind of specific setup. They are prefixed with an @ because that only rarely conflicts with anything, and tab-complete helps me remember the less frequently used ones.
The @project scripts setup the whole environment, alias the appropriate build tools and versions of those tools, prepare the correct IDE config if needed, drop me in the project's directory, etc. Some start a VPN connection because some of my clients only have git access over VPN etc.
Because I've worked on many things over many years, most of these scripts also output some "help" output so I can remember how shit works for a given project.
Edit: a word on aliases, I frequently alias tools like maven or ansible to include config files that are specific to that project. That way I can have a .m2 folder for every project that doesn't get polluted by other projects, I don't have to remember to tell ansible which inventory file to use, etc. I'm lazy and my memory is for shit.
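For concreteness, one of those @-scripts might look roughly like this (everything here — names, paths, tools — is invented for the example):

#!/usr/bin/env bash
echo "@someproject: Java service; build with 'mvn package', deploy with 'make release'"
PROJ="$HOME/projects/someproject"
export MAVEN_OPTS="-Dmaven.repo.local=$PROJ/.m2"   # project-local repo, no cross-pollution
export ANSIBLE_INVENTORY="$PROJ/inventory.ini"     # no need to pass -i every time
export PATH="$PROJ/.toolbin:$PATH"                 # wrappers pinning tool versions
cd "$PROJ" || exit 1
exec "${SHELL:-bash}" -i                           # drop into a shell with this environment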
Slightly related but mise, a tool you can use instead of eg make, has “on enter directory” hooks that can reconfigure your system quite a bit whenever you enter the project directory in the terminal. Initially I was horrified by this idea but I have to admit it’s been quite nice to enter into a directory and everything is set up just right, also for new people joining. It has built in version management of just about every command line tool you could imagine, so that an entire team can be on a consistent setup of Python, Node, Go, etc.
I see other people mentioning env, and mise does this too, with additional support for adding extra env overrides via a dedicated file such as a .mise.testing.toml config, and running something like:
MISE_ENV=testing bun run test
(“testing” in this example can be whatever you like)
I'm stealing the top comment here because you probably know what I'm asking.
I've always wanted a linux directory hook that runs some action. Say I have a scripts dir filled with 10 different shell scripts. I could easily have a readme or something to remember what they all do.
What I want is some hook in a dir that every time I cd into that dir it runs the hook. Most of the time it would be a simple 'cat usage.txt' but sometimes it maybe 'source .venv/bin/activate'.
I know I can alias the cd and the hook together but I don't want that.
direnv does this. Its intended use case is loading environment variables (you could use this to load your virtualenv), but it works by sourcing a script — and that script can be 'cat usage.txt'.
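A minimal .envrc along those lines (assuming direnv; run `direnv allow` once in the directory):

cat usage.txt                 # print the cheat sheet for the scripts in this directory
source .venv/bin/activate     # or use direnv's own `layout python` helper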
Great tool.
If you use Emacs (and you should!), there’s a direnv mode. Emacs also has its own way to set configuration items within a directory (directory-local variables), and is smart enough to support two files, so that there can be one file checked into source control for all members of a project and another ignored for one’s personal config.
direnv does exactly what you describe (and a lot more) using flake.nix. cd into the directory and it automatically runs. I use it in every single project/repository to set environment variables and install project-specific dependencies locked to specific versions.
As other comments say, direnv does that, but honestly you should look into mise-en-place (mise) which is really great, and also includes a "mini-direnv"
This is exactly the kind of stuff I'm most interested in finding on HN. How do other developers work, and how can I get better at my work from it?
What's always interesting to me is how many of these I'll see and initially think, "I don't really need that." Because I'm well aware of the effect (which I'm sure has a name - I suppose it's similar to induced demand) of "make $uncommon_task much cheaper" -> "$uncommon_task becomes the basis of an entirely new workflow/skill". So I'm going to try out most of them and see what sticks!
Also: really love the style of the post. It's very clear but also includes super valuable information about how often the author actually uses each script, to get a sense ahead of time for which ones are more likely to trigger the effect described above.
A final aside about my own workflows which betrays my origins... for some of these operations and for others i occasionally need, I'll just open a browser dev tools window and use JS to do it, for example lowercasing a string :)
This is one of the things I miss the most about hacker conferences. The sharing of tools, scripts, tips and tricks. It was, and still is, just as fun as trading cards.
I'd love to see a cost benefit analysis of the author's approach vs yours, which includes the time it took the author to create the scripts, remember/learn to use them/reference them when forgetting syntax, plus time spent migrating whenever changing systems.
Not all time is created equal. I’ll happily invest more time than I’ll ever get back in refining a script or vim config or whatever, so that later, when I’m busy and don’t have time to muck around, I can stay in the flow and not be annoyed by distractions.
Sometimes it's a matter of sanity rather than time management. I once created a systemd service which goes to a company web page and downloads some files which I sometimes need. The script was pretty hacky, and writing it took me a lot of time - probably more than clicking through the page manually would have taken in the long run. But the clicking was so annoying that I feel it was totally worth it.
If you write these sorts of things in Python, argparse is worth investigating: https://docs.python.org/3/library/argparse.html - it's pretty easy to use, makes it easy to separate the command line handling from the rest of the code, and, importantly, will generate a --help page for you. And if you want something it can't do, you can still always write the code yourself!
I don’t like Python in general, but even so I’ll say that argparse is indeed very nice. When I was writing ruby, I always felt that OptionParser¹ wasn’t as good. Swift has Argument Parser², officially from Apple, which is quite featureful. For shell, I have a couple of bespoke patterns I have been reusing in every script for many years.
Why is this interesting to you? The whole point of doing all of this is to be more efficient in the long run. Of course there is an initial setup cost and learning curve, after which you will hopefully feel quite efficient with your development environment. You are making it sound like it is not worth the effort because you have to potentially spend time learning "it"? I do not believe that it takes long to learn it, but of course it can differ a lot from person to person. Your remarks seem like non-issues to me.
It's interesting because there's a significant chance one wastes more time tinkering around with custom scripts than saving in the long run. See https://xkcd.com/1205/
For example, the "saves 5 seconds on a task that I do once a month" script from the blog post: hopefully the author did not spend more than 5 minutes writing and maintaining it, or they're losing time in the long run.
1. even if it costs more time, it could also save more annoyance which could be a benefit
2. by publishing the scripts, anyone else who comes across them can use them and save time without the initial cost. similarly, making and sharing these can encourage others to share their own scripts, some of which the author could save time with
Not all time is created equally though, so I disagree with that xkcd.
If something is time sensitive it is worth spending a disproportionate amount of time to speed things up at some later time. For example if you’re debugging something live, in a live presentation, working on something with a tight deadline etc.
Also you don’t necessarily know how often you’ll do something anyways.
One thing which is often ignored in these discussions is the experience you gain. The time you “wasted” on your previous scripts by taking longer to write them compounds in time saved in the future because you can now write more complex tasks faster.
The problem is, to really internalize that benefit, one would need to have an open mind to trying things out, and many folks seem to resist that. Oh well, more brain connections for me I suppose.
I find that now with AI, you can make scripts very quickly, reducing the time to write them by a lot. There is still some time needed for prompting and testing but still.
>YOU DON'T UNDERSTAND. I NEED TO BE CONSTANTLY OPTIMIZING MY UPTIME. THE SCIENCE DEMANDS IT. TIMEMAXXING. I CAN'T FREELY EXPLORE OR BRAINSTORM, IT'S NOT XKCD 1205 COMPLIANT. I MUST EVALUATE EVERY PROPOSED ACTIVITY AGAINST THE TIME-OPTIMIZATION-PIVOT-TABLE.
It's weird how the circle of life progresses for a developer or whatever.
- When I was a fresh engineer I used a pretty vanilla shell environment
- When I got a year or two of experience, I wrote tons of scripts and bash aliases and had a 1k+ line .bashrc the same as OP
- Now, as a more tenured engineer (15 years of experience), I basically just want a vanilla shell with zero distractions, aliases or scripts and use native UNIX implementations. If it's more complicated than that, I'll code it in Python or Go.
I think it's more likely to say that this comes from a place of laziness than some enlightened peak. (I say this as someone who does the same, and is lazy).
When I watch the work of coworkers or friends who have gone these rabbit holes of customization I always learn some interesting new tools to use - lately I've added atuin, fzf, and a few others to my linux install
I gave it a try a few months ago, but did not work for me. My main issue is that atuin broke my workflow with fzf (If I remember correctly, pressing ctrl + r to lookup my shell history did not work well after installing atuin).
I went through a similar cycle. Going back to simplicity wasn't about laziness for me, it was because I started working across a bunch more systems and didn't want to do my whole custom setup on all of them, especially ephemeral stuff like containers allocated on a cluster for a single job. So rather than using my fancy setup sometimes and fumbling through the defaults at other times, I just got used to operating more efficiently with the defaults.
You can apply your dotfiles to servers you SSH into rather easily. I'm not sure what your workflow is like but frameworks like zsh4humans have this built in, and there are tools like sshrc that handle it as well. Just automate the sync on SSH connection. This also applies to containers if you ssh into them.
Do you have experience with these tools? Some such as sshrc only apply temporarily per session and don't persist or affect other users. I keep plain 'ssh' separate from shell functions that apply dotfiles and use each where appropriate. You can also set up temporary application yourself pretty easily.
Sometimes we need to use service accounts, so while you do have your own account, all the interesting things happen in svc_foo, to which you cannot add your .files.
You said you were already using someone else's environment.
You can't later say that you don't.
Whether or not shell access makes sense depends on what you are doing, but a well written application server running in a cloud environment doesn't need any remote shell account.
It's just that approximately zero typical monolithic web applications meet that level of quality and given that 90% of "developers" are clueless, often they can convince management that being stupid is OK.
They do get to work on someone else's server, they do not get a separate account on that server. The client would not be happy to have them mess around with the environment.
If, in the year 2025, you are still using a shared account called "root" (password: "password"), and it's not a hardware switch or something (and even they support user accounts these days), I'm sorry, but you need to do better. If you're the vendor, you need to do better, if you're the client, you need to make it an issue with the vendor and tell them they need to do better. I know, it's easy for me to say from the safety of my armchair at 127.0.0.1. I've got some friends in IT doing support that have some truly horrifying stories. But holy shit why does some stuff suck so fucking much still. Sorry, I'm not mad at you or calling you names, it's the state of the industry. If there were more pushback on broken busted ass shit where this would be a problem, I could sleep better at night, knowing that there's somebody else that isn't being tortured.
The defaults are unbearable. I prefer using chezmoi to feel at home anywhere. There's no reason I can't at least have my aliases.
I'd rather take the pain of writing scripts to automate this for multiple environments than suffer the death by a thousand cuts which are the defaults.
chezmoi is the right direction, but I don't want to have to install something on the other server, I should just be able to ssh to a new place and have everything already set up, via LocalCommand and Host * in my ~/.ssh/config
For anyone else reading this comment who was confused because this seems like the opposite of what you'd expect about Nix: Hacker News ate the asterisks and turned them into italics.
Besides many nix computers I also have a wife, dog, children, chores, and shopping to be done. Unlike when I was a young engineer, I can't stay up all night fiddling with bash scripts and environments.
What does your wife, dog, children, chores, and shopping have to do with custom configuration and scripts? Just set up a Git repo online, put your files there, and take a couple of minutes to improve it incrementally when you encounter inconveniences. And just like that, you made your life easier for a marginal effort.
Yeah - been there, done that, too. I feel like the time I gain from having a shortcut is often less than what I would need to maintain it or to remember the real syntax when I'm on a machine where it's not available (which happens quite often in my case). I try to go with system defaults as much as possible nowadays.
I can't say I relate at all (5 years of experience).
They'll have to pry my 1000-line .zshrc from my cold, dead hands.
For example, zsh-autosuggestions improves my quality of life so ridiculously much it's not even funny.
I moved away from a 1000-line .zshrc when I had to do stuff on Linux VMs/dockers and I was lost a lot. But zsh-autosuggestions and fzf-tab are not going anywhere.
I am going through a phase of working with younger engineers who have many dotfiles, and I just think "Oh, yeh, I remember having lots of dotfiles. What a hassle that was."
Nowadays I just try to be quite selective with my tooling and learn to change with it - "like water", so to speak.
(I say this with no shade to those who like maintaining their dotfiles - it takes all sorts :))
> When I was a fresh engineer I used a pretty vanilla shell environment. When I got a year or two of experience, I wrote tons of scripts
Does this mean that you learned to code to earn a paycheck? I'm asking because I had written hundreds of scripts and Emacs Lisp functions to optimize my PC before I got my first job.
as a person who loves their computer, my ~/bin is full. i definitely (not that you said this) do not think "everything i do has to be possible on every computer i am ever shelled into"
being a person on a computer for decades, i have tuned how i want to do things that are incredibly common for me
though perhaps you're referring to work and not hobby/life
Prepare to swing back again. With nearly 30 years experience I find the shell to be the best integration point for so many things due to its ability to adapt to whatever is needed and its universal availability. My use of a vanilla shell has been reduced to scripting cases only.
On the other hand, the author seems to have a lot of experience as well.
Personally I tend to agree... there is a very small subset of things I find worth aliasing. I have a very small amount and probably only use half of them regularly. Frankly I wonder how my use case is so different.
In my case I'd start typing it in my browser and then just click something I've visited 100 times before. There is something to be said about reducing that redundant network call, but I don't think it makes much practical difference, and the mental mapping/discoverability cost of aliases isn't nothing.
For the Infra Engineers out there who still manage fleets of pets, this is double true. You may not have access or be able to use all your shortcut scripts so you better know the raw commands on that unsupported RHEL6 host.
I use a dotfile with aliases and functions, mostly to document / remember commands I find useful. It's been a handy way to build a living document of the utils I use regularly, and is easy to migrate to each new workstation.
I just use the autocd zsh shell option for this. And I also use `hash -d` to define shortcuts for common directories. Then just “executing” something like `~gh/apache/kafka` will cd to the right place.
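Roughly, in zsh (the directory names are just examples):

setopt autocd                 # typing a directory path as a command cd's into it
hash -d gh=~/src/github.com   # defines the named directory ~gh
# now entering `~gh/apache/kafka` at the prompt changes into ~/src/github.com/apache/kafka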
Given the nature of current operating systems and applications, do you think the idea of “one tool doing one job well” has been abandoned? If so, do you think a return to this model would help bring some innovation back to software development?
Rob Pike: Those days are dead and gone and the eulogy was delivered by Perl.
But was the eulogy written in Perl poetry? I see it everywhere, but I don't know who this JAPH guy is. It's a strange way of spelling Jeff, and it's odd that he types his name in all caps, but he has published a remarkable quantity of works and he's even more famous than the anonymous hacker known as 4chan.
Oh, I hate that paradigm. Well, maybe chmod, ls, rsync and curl all do their OWN thing very well, but every time I am using one of those tools I have to remember whether a more detailed response is -v or maybe -vvv or --verbose or -x for some reason, because the maintainer felt like it at 2:32 in the morning 17 years ago... Some consistency would help, but... it is probably impossible: the flame war over -R meaning recursive or read-only would never end.
I've heard this often, but I'm going on ~25 years of using Linux, and I would be lost without my dotfiles. They represent years of carefully crafting my environment to suit my preferences, and without them it would be like working on someone else's machine. Not impossible, just very cumbersome.
Admittedly, I've toned down the configs of some programs, as my usage of them has evolved or diminished, but many are still highly tailored to my preferences. For example, you can't really use Emacs without a considerable amount of tweaking. I mean, you technically could, but such programs are a blank slate made to be configured (and Emacs is awful OOB...). Similarly for zsh, which is my main shell, although I keep bash more vanilla. Practically the entire command-line environment and the choices you make about which programs to use can be considered configuration. If you use NixOS or Guix, then that extends to the entire system.
If you're willing to allow someone else to tell you how you should use your computer, then you might as well use macOS or Windows. :)
I prefer using kubectl over any other method, so I have plenty of functions to help with that. I'd never consider using Python or Go for this, although I do have plenty of Python and Go "scripts" on my path too.
The moment of true enlightenment is when you finally decide to once and for all memorize all the arguments and their order for those command line utilities that you use at an interval that's just at the edge of your memory: xargs, find, curl, rsync, etc.
That, plus knowing how to parse a man file to actually understand how to use a command (a skill that takes years to master) pretty much removes the need for most aliases and scripts.
Why would I even attempt to do that? Life is too short to try to remember something like that. Maybe 20 years ago, when the internet was not that common. Or maybe if you are a hacker, hacking other people's machines. Me? Just some dev trying to make some money to feed my family? I prefer to take a walk in the woods.
I already have limited space for long term memory, bash commands are very far down the list of things I'd want to append to my long term storage.
I use ctrl-R with a fuzzy matching program, and let my terminal remember it for me.
And before it's asked: yes that means I'd have more trouble working in a different/someone else's environment. But as it barely ever happens for me, it's hardly an important enough scenario to optimize for.
If you come through the other side, you set up LocalCommand in your .ssh/config which copies your config to every server you ssh to, and get your setup everywhere.
It is often useful to chain multiple sed commands and sometimes shuffle them around. In those cases I would need to keep changing the first sed. Sometimes I need to grep before I sed. Using cat, tail and head makes things more modular in the long run, I feel. It’s the ethos of each command doing one small thing.
I use an `anon` function to anonymize my Mac clipboard when I want to paste something to the public ChatGPT, company Slack, private notes, etc. I ran it through itself before pasting it here, for example.
I think they're the same except '.' is POSIX and 'source' is specific to bash and compatible shells. I personally just use source since it's easier to read and zsh and bash account for basically 100% of my shell usage.
Note, fftime copies the audio and video data without re-encoding, which can be a little janky, but often works fine, and can be much (100x) faster on large files. To re-encode just remove "-c copy"
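Presumably it wraps an invocation along these lines (a guess at the general shape, not the article's exact script):

ffmpeg -i input.mp4 -ss 00:01:00 -to 00:02:00 -c copy output.mp4   # cut a range without re-encoding (snaps to keyframes)
# drop "-c copy" to re-encode, which is slower but frame-accurate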
Historical note: getting hold of these scripts by chatting to various developers was the motivation for the original 2004 "lifehacks" talk[1][2]. If you ever get into an online argument over what is a "life hack" and what isn't, feel free to use short scripts like these as the canonical example.
Otherwise, I am happy to be pulled into your discussion, Marshall McLuhan style[3] to adjudicate, for a very reasonable fee.
I'm kicking myself for not thinking of the `nato` script.
I tend to try to not get too used to custom "helper" scripts because I become incapacitated when working in other systems. Nevertheless, I really appreciate all these scripts if nothing else than to see what patterns other programmers pick up.
My only addition is a small `tplate` script that creates HTML, C, C++, Makefile, etc. "template" files to start a project. Kind of like a "wizard setup". e.g.
$ tplate c
#include <stdio.h>
#include <stdlib.h>
int main(int argc, char **argv) {
}
And of course, three scripts `:q`, `:w` and `:wq` that get used surprisingly often.
My fav script to unpack anything, found a few years ago somewhere
# ex - archive extractor
# usage: ex <file>
function ex() {
  if [ -f "$1" ] ; then
    case "$1" in
      *.tar.bz2) tar xjf "$1" ;;
      *.tar.gz) tar xzf "$1" ;;
      *.tar.xz) tar xf "$1" ;;
      *.bz2) bunzip2 "$1" ;;
      *.rar) unrar x "$1" ;;
      *.gz) gunzip "$1" ;;
      *.tar) tar xf "$1" ;;
      *.tbz2) tar xjf "$1" ;;
      *.tgz) tar xzf "$1" ;;
      *.zip) unzip "$1" ;;
      *.Z) uncompress "$1" ;;
      *.7z) 7z x "$1" ;;
      *) echo "'$1' cannot be extracted via ex()" ;;
    esac
  else
    echo "'$1' is not a valid file"
  fi
}
For compression, I have one for .tar.gz, but it doesn't get much use on my system. I need something a bit easier than 'pack file file file archive.tar.gz'.
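A sketch of a slightly easier variant that names the archive for you (hypothetical, and untested against edge cases):

pack() {
  out="${1%/}.tar.gz"                         # derive the archive name from the first argument
  tar czf "$out" "$@" && echo "created $out"
}
# pack photos          -> photos.tar.gz
# pack notes.txt a b   -> notes.txt.tar.gz containing all three files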
Some cool things here but in general I like to learn and use the standard utilities for most of this. Main reason is I hop in and out of a lot of different systems and my personal aliases and scripts are not on most of them.
sed, awk, grep, and xargs along with standard utilities get you a long long way.
Same. I interact with too many machines, many of which are ephemeral and will have been reprovisioned the next time I have to interact with it.
I value out of the box stuff that works most everywhere. I have a fairly lightweight zsh config I use locally but it’s mostly just stuff like a status line that suits me, better history settings, etc. Stuff I won’t miss if it’s not there.
I totally agree with this, I end up working on many systems, and very few of them have all my creature comforts. At the same time, really good tools can stick around and become impactful enough to ship by default, or to be easily apt-get-able. I don't think a personal collection of scripts is the way, but maybe a well maintained package.
The Gen AI tooling is exceptionally good at doing these sorts of things, and way more than just "mkdir $1 && cd $1". For example:
I have used it to build an "escmd" tool for interacting with Elasticsearch. It makes the available commands much more discoverable, the output it formats in tables, and gets rid of sending JSON to a curl command.
A variety of small tools that interact with Jira (list my tickets, show tickets that are tagged as needing ops interaction in the current release).
A tool to interact with our docker registry to list available tags and to modify tags, including colorizing them based on the sha hash of the image so it's obvious which ones are the same. We manage docker container deploys based on tags so if we "cptag stg prod" on a project, that releases the staging artifact to production, but we also tag them by build date and git commit hash, so we're often working with 5-7 tags.
Script to send a "Software has successfully been released" message via gmail from the command-line.
A program to "waituntil" a certain time to run a command: "waituntil 20:00 && run_release", with nice display of a countdown.
I have a problem with working on too many things at once and then committing unrelated things tagged with a particular Jira case. So I had it write me a commit program that lists my tickets, shows the changed files, and lets me select which ones go with that ticket.
All these are things I could have built before, but would have taken me hours each. With the GenAI, they take 5-15 minutes of my attention to build something like this. And Gen AI seems really, really great at building these small, independent tools.
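The "waituntil" idea above is small enough to sketch (a hypothetical shell version using GNU date; the real tool's interface and countdown display may differ):

waituntil() {
  target=$(date -d "$1" +%s) || return 1
  [ "$target" -le "$(date +%s)" ] && target=$((target + 86400))   # already past today: assume tomorrow
  while now=$(date +%s); [ "$now" -lt "$target" ]; do
    printf '\rwaiting: %d min left ' $(( (target - now) / 60 ))
    sleep 10
  done
  printf '\n'
}
# waituntil 20:00 && run_release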
> ocr my_image.png extracts text from an image and prints it to stdout. It only works on macOS
The Mac Shortcut at https://github.com/e-kotov/macos-shortcuts lets you select a particular area of the screen (as with Cmd-Shift-4) and copies the text out of that, allowing you to copy exactly the text you need from anywhere on your screen with one keyboard shortcut. Great for popups with unselectable text, and copying error messages from coworkers' screenshares.
I keep meaning to generalize this (directory target, multiple sources, flags), but I get quite a bit of mileage out of this `unmv` script even as it is:
#!/bin/sh
if test "$#" != 2
then
echo 'Error: unmv must have exactly 2 arguments'
exit 1
fi
exec mv "$2" "$1"
I find that I like working with the directory stack and having a shortened version of the directory stack in the title bar, e.g. by modifying the stock Debian .bashrc
# If this is an xterm set the title to the directory stack
case "$TERM" in
xterm*|rxvt*)
if [ -x ~/bin/shorten-ds.pl ]; then
PS1="\[\e]0;\$(dirs -v | ~/bin/shorten-ds.pl)\a\]$PS1"
else
PS1="\[\e]0;${debian_chroot:+($debian_chroot)}\u@\h: \w\a\]$PS1"
fi
;;
*)
;;
esac
The script shorten-ds.pl takes e.g.
0 /var/log/apt
1 ~/Downloads
2 ~
and shortens it to:
0:apt 1:Downloads 2:~
#!/usr/bin/perl -w
use strict;
my @lines;
while (<>) {
chomp;
s%^ (\d+) %$1:%;
s%:.*/([^/]+)$%:$1%;
push @lines, $_
}
print join ' ', @lines;
That coupled with functions that take 'u 2' as shorthand for 'pushd +2' and
'o 2' for 'popd +2' make for easy manipulation of the directory stack:
u() {
if [[ $1 =~ ^[0-9]+$ ]]; then
pushd "+$1"
else
pushd "$@"
fi
}
o() {
if [[ $1 =~ ^[0-9]+$ ]]; then
popd "+$1"
else
popd "$@" # lazy way to cause an error
fi
}
I have mkcd exactly ( I wonder how many of us do, it's so obvious)
I have almost the same, but differently named with scratch(day), copy(xc), markdown quote(blockquote), murder, waitfor, tryna, etc.
I used to use telegram-send with a custom notification sound a lot for notifications from long-running scripts if I walked away from the laptop.
I used to have one called timespeak that would speak the time to me every hour or half hour.
I have go_clone that clones a repo into GOPATH which I use for organising even non-go projects long after putting go projects in GOPATH stopped being needed.
I liked writing one-offs, and I don't think it's premature optimization because I kept getting faster at it.
I have three different way to open file with vim:
v: vim (or neovim, in my case)
vv: search/preview and open file by filename
vvv: search/preview and open file by its content
alias v='nvim'
alias vv='f=$(fzf --preview-window "right:50%" --preview "bat --color=always {1}"); test -n "$f" && v "$f"'
alias vvv='f=$(rg --line-number --no-heading . | fzf -d: -n 2.. --preview-window "right:50%:+{2}" --preview "bat --color=always --highlight-line {2} {1}"); test -n "$(echo "$f" | cut -d: -f1)" && v "+$(echo "$f" | cut -d: -f2)" "$(echo "$f" | cut -d: -f1)"'
Nice! Tangentially related: I built a (MacOS only) tool called clippy to be a much better pbcopy. It was just added to homebrew core. Among other things, it auto-detects when you want files as references so they paste into GUI apps as uploads, not bytes.
clippy image.png # then paste into Slack, etc. as upload
clippy -r # copy most recent download
pasty # copy file in Finder, then paste actual file here
Adding the word "then" to your first comment would have helped me: (lacking context, I thought the comments explained what the command does, as is common convention)
clippy image.png # then paste into Slack, etc. as upload
Also:
pasty # paste actual file, after copying file in Finder
This is really interesting, but I need the highlights reel. So I need a script to summarize Hacker News pages and/or arbitrary web pages. Maybe that's what I want for getting the juice out of Medium articles.
Broadly, I very much love this approach to things and wish it was more "acceptable"? It reminds me of the opposite of things like "the useless use of cat", which to me is one of the WORST meme-type-things in this space.
Like, it's okay -- even good -- for the tools to bend to the user and not the other way around.
Using 'copy' as a clipboard script tells me OP never lived through the DOS era I guess... Used to drive me mad switching between 'cp' in UNIX and 'copy' in DOS.
(Same with the whole slash vs backslash mess.)
As a programmer, you sometimes want to make an alphabet lookup table. So, something like:
var alpha_lu = "abcdefghijklmnopqrstuvwxyz";
Typing it out by hand is error prone as it's not easy to see if you've swapped the order or missed a character.
I've needed the alphabet string or lookup rarely, but I have needed it before. Some applications could include making your own UUID function, making a small random naming scheme, associating small categorical numbers to letters, etc.
The author of the article mentioned they do web development, so it's not hard to imagine they've had to create a URL shortener, maybe more than once. So, for example, creating a small name could look like:
function small_name(len) {
let a = "abcdefghijklmnopqrstuvwxyz",
v = [];
for (let i=0; i<len; i++) {
v.push( a[ Math.floor( Math.random()*a.length ) ] );
}
return v.join("");
}
//...
small_name(5); // e.g. "pfsor"
Dealing with strings, dealing with hashes, random names, etc., one could imagine needing to do functions like this, or functions that are adjacent to these types of tasks, at least once a month.
If your native language uses a different alphabet, you might not have been taught "the alphabet song". For example, I speak/read passable Russian, but could not alphabetize a list in Russian.
For me it's when I call customer service or support on the phone, and either give them an account #, or confirm a temporary password that I have been verbally given.
In fish, I have an abbreviation that automatically expands double dots into ../ so that you can just spam double dots and visually see how far you're going.
# Modified from
# https://github.com/fish-shell/fish-shell/issues/1891#issuecomment-451961517
function append-slash-to-double-dot -d 'expand .. to ../'
# Get commandline up to cursor
set -l cmd (commandline --cut-at-cursor)
# Match last line
switch $cmd[-1]
case '*.'
commandline --insert './'
case '*'
commandline --insert '.'
end
end
Does zsh support this out-of-the-box? Because I definitely never had to setup any of these kinds of aliases but have been using this shorthand dot notation for years.
Good point, when working with keybindings, you'll inevitably end up overriding built-ins. I see it as a trade-off, between something I don't know of (and wouldn't use) and something I find useful. Works for me :)
Absolutely. From back in the day, the annoying one was GNU screen, which took over ctrl-a by default. I overrode that to be ctrl-^, which in bash is transpose (makes "zx" into "xz"), which was rare enough to be okay with losing.
My most important script has been to remap CapsLock as a kind of custom Meta key, that transforms (when pressed) the Space into Return, hjkl into arrows, io into PgUp/PgDn, and 1-9 into function keys. Now I have a 60% keyboard that takes 0 space on my desk. And I am reaaaally happy with this setup.
[that, plus LinkHint plugin for Firefox, and i3 for WM is my way to go for a better life]
17 years ago I wrote a short VBA macro that takes the highlighted range of cells, concatenates the values into a comma-separated list, then opens the list in Notepad for easy copying and further use. I can't begin to count the number of executions by myself and those I have shared it with.
I wrote it in a way that's too intertwined with my other shit to be shareable with people, but honestly you can copy-paste my comment to your friendly neighborhood LLM and you'll get something decent. Indeed it uses `env`.
> cpwd copies the current directory to the clipboard. Basically pwd | copy. I often use this when I’m in a directory and I want use that directory in another terminal tab; I copy it in one tab and cd to it in another. I use this once a day or so.
You can configure your shell to notify the terminal of directory changes, and then use your terminal’s “open new window” function (eg: ctrl+shift+n) to open a new window retaining the current directory.
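For example, in zsh the usual trick is to emit an OSC 7 escape from a chpwd hook so the terminal always knows the current directory (a sketch; strictly speaking the path should be percent-encoded):

chpwd() { printf '\e]7;file://%s%s\e\\' "$HOST" "$PWD"; }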
> url "$my_url" parses a URL into its parts. I use this about once a month to pull data out of a URL, often because I don’t want to click a nasty tracking link.
This sounds pretty useful!
Coincidentally, I have recently learned that Daniel Stenberg et al (of cURL fame) wrote trurl[1], a libcurl-based CLI tool for URL parsing. Its `--json` option seems to yield similar results as TFA's url, if slightly less concise because of the JSON encoding. The advantage is that recent releases of common Linux distros seem to include trurl in their repos[2].
I have a script called catfiles that I store in ~/.local/bin that recursively dumps every source file with an associated file header so I can paste the resulting blob in to Gemini and ChatGPT in order to have a conversation about the changes I would like to make before I send off the resulting prompt to Gemini Code Assist.
Here's my script, if anyone is interested, as I find it to be incredibly useful.
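The script itself isn't reproduced here, but a rough sketch of the same idea looks like this (the file globs are just examples):

find . -type f -not -path '*/.git/*' \( -name '*.py' -o -name '*.ts' -o -name '*.go' \) -print |
sort | while IFS= read -r f; do
  printf '\n===== FILE: %s =====\n' "$f"
  cat "$f"
done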
Where vtt2txt is a python script — slightly too long to paste here — which strips out the subtitle formatting, leaving a (mostly) human readable transcript.
I started writing way more utility scripts when I found babashka. Magic of clojure, instant startup, easy to shell out to any other command, tons of useful built in stuff, developing with the REPL. It’s just a good time!!
I had my hopes on this project RawDog using local smol sized LLMs but it hasn't been updated in a while. I feel like all this should be running easily in the background nowadays.
The scripts from my junk drawer (https://github.com/peterwwillis/junkdrawer) I use every day are 'kd' and 'gw', which use the Unix dialog command to provide an easy terminal UI for Kubectl and Git Worktrees (respectively)... I probably save 15+ minutes a day just flitting around in those UIs. The rest of the scripts I use for random things; tasks in AWS/Git/etc I can never remember, Terraform module refactoring, Bitbucket/GitHub user management, Docker shortcuts, random password generation, mirroring websites with Wget, finding duplicate files, etc.
Obviously, to each their own, but to me, this is an overwhelming amount of commands to remember on top of all the ones they are composed of that you will likely need to know anyway — regardless if all the custom ones exist.
Like, I'd have to remember both `prettypath` and `sed`, and given that there's hardly any chance I'll not need `sed` in other situations, I now need to remember two commands instead of one.
On top of that `prettypath` only does s/:/\\n/ on my path, not on other strings, making its use extremely narrow. But generally doing search and replace in a string is incredibly useful, so I'd personally rather just use `sed` directly and become more comfortable with it. (Or `perl`, but the point is the same.)
As I said, that's obviously just my opinion, if loads of custom scripts/commands works for you, all the more power to you!
-I replace-str
Replace occurrences of replace-str in the initial-arguments
with names read from standard input. Also, unquoted blanks
do not terminate input items; instead the separator is the
newline character. Implies -x and -L 1.
I've started using snippets for code reviews, where I find myself making the same comments (for different colleagues) regularly. I have a keyboard shortcut opening a fuzzy search to find the entry in a single text file. That saves a lot of time.
As an aside, I find most of these commands very long. I tend to use very short aliases, ideally 2 characters. I'm assuming the author uses tab most of the time, if the prefixes don't overlap beyond 3 characters it's not that bad, and maybe the history is more readable.
One of my biggest headaches is stripping a specific number of bytes from the head or tail of a binary file, and I couldn't find any built-in tool for that, so I wrote one in C++.
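For what it's worth, with GNU coreutils the standard tools can get close (the byte counts here are examples):

head -c -16 in.bin > out.bin           # drop the last 16 bytes (the leading "-" is GNU-only)
tail -c +17 in.bin > out.bin           # drop the first 16 bytes (start output at byte 17)
dd if=in.bin of=out.bin bs=1 skip=16   # portable (if slow) way to drop the first 16 bytes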
Rad is built specifically for writing CLI scripts and is perfect for these sorts of small to medium scripts, takes a declarative approach to script arguments, and has first-class shell command integration. I basically don't write scripts in anything else anymore.
> `nato bar` returns Bravo Alfa Romeo. I use this most often when talking to customer service and need to read out a long alphanumeric string, which has only happened a couple of times in my whole life. But it’s sometimes useful!
Even more useful is just learning the ICAO Spelling Alphabet (aka NATO Phonetic Alphabet, of which it is neither). It takes like an afternoon and is useful in many situations, even if the receiver does not know it.
Some time ago I tried to tell my email address to someone in Japan over the phone who did not speak English very well. It turned out to be basically impossible. I realized later one could probably come up with a phonetic alphabet of English words most Japanese know!
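A minimal bash sketch of such a `nato` helper (an illustration, not the article's script):

nato() {
  local words=(Alfa Bravo Charlie Delta Echo Foxtrot Golf Hotel India Juliett Kilo Lima Mike
               November Oscar Papa Quebec Romeo Sierra Tango Uniform Victor Whiskey X-ray Yankee Zulu)
  local s c i
  s=$(printf '%s' "$*" | tr '[:upper:]' '[:lower:]')
  for (( i = 0; i < ${#s}; i++ )); do
    c=${s:i:1}
    case $c in
      [a-z]) printf '%s ' "${words[$(( $(printf '%d' "'$c") - 97 ))]}" ;;
      *)     printf '%s ' "$c" ;;
    esac
  done
  echo
}
# nato bar   ->   Bravo Alfa Romeo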
I got a ccurl python script that extracts the cookies from my Firefox profile and then passes those on to curl; that way I can get webpages where I'm logged in.
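The general shape of that trick (a rough sketch; the profile path and domain are placeholders, and the cookie database may be locked while Firefox is running):

db=$(ls ~/.mozilla/firefox/*.default*/cookies.sqlite | head -n 1)
cookies=$(sqlite3 "$db" \
  "SELECT name || '=' || value FROM moz_cookies WHERE host LIKE '%example.com';" | paste -sd ';' -)
curl -s -H "Cookie: $cookies" https://example.com/some/page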
As a fun game, I suggest feeding the entire piece to an LLM and asking it to create those scripts. The differences between Claude, GPT-5 and Gemini are very interesting.
As a bonus, I prepend my custom aliases or scripts with my user name and hyphen (i.e helicaltwine-). It helps me recall rarely used scripts when I need them and forget the names.
I follow a similar but more terse pattern. I prepend them all with a comma, and I have yet to find any collisions. If you're using bash (and I assume posix sh as well), the comma character has no special meaning, so this is quite a nice use for it. I agree that it's nice to type ",<tab>" and see all my custom scripts appear.
This is one area where I've found success with vibe coding: making scripts for repetitive tasks that are just above the complexity threshold, where the math between automating and doing it manually is not so clear. I have Copilot generate the code for me, and honestly I don't care too much about its quality or extensibility; the scripts are easy enough to read through that I don't feel like my job is being an AI PR reviewer.
Nice. I have a bash script similar to the one listed "removeexif" called prep_for_web which takes any image file (PNG, BMP, JPG, WebP), scrubs EXIF data, checks for transparency and then compresses it to either JPG using MozJPEG or to PNG using PNGQuant.
An important advantage of aliases was not mentioned: I see everything in one place and can easily build aliases on top of other aliases without much thinking.
Anyways, my favourite alias that I use all the time is this:
alias a='nvim ~/.zshrc && . ~/.zshrc'
It solves the "not loaded automatically" part, at least for the current terminal.
A subprocess (git) can't modify the working directory of the parent process (the shell). This is a common annoyance with file managers like yazi and ranger as well—you need an extra (usually manual!) installation step to add a shell integration for whichever shell you're using so the shell itself can change directory.
The best solution for automatically cd'ing into the repo is to wrap git clone in a shell function or alias. Unfortunately I don't think there's any way to make git clone print the path a repository was cloned to, so I had to do some hacky string processing that tries to handle the most common usage (ignore the "gh:" in the URL regex, my git config just expands it to "git@github.com:"):
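The string processing was elided above, but the general shape of such a wrapper is something like this (a sketch, not the commenter's exact function — note it must be a shell function so the final cd affects the current shell):

gclone() {
  git clone "$@" || return
  local last dir
  for last in "$@"; do :; done       # take the last argument: the URL (or an explicit target dir)
  dir=${last##*/}; dir=${dir%.git}   # derive the repo name from the URL
  [ -d "$last" ] && dir=$last        # an explicit target directory wins
  cd "$dir"
}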
If you use x0vnc (useful if you use a Linux machine both from the attached screen and from VNC, and in a bunch of other scenarios), copy and paste to and from the VNC client is not implemented, which is quite frustrating. Here are 2 scripts that do that for you; I now use this all day. https://github.com/francoisp/clipshare
The nato phonetic alphabet one cracked me up. My dude you don't need that, call center employees don't know it, just say S as in Sugar like ur grandma used to.
The NATO alphabet is made of commonly known words that are hard to misspell and hard to mishear (at least the initial phonemes). The person on the other end doesn't need to be able to recite the words, they just need to be able to hear "november" and recognize that it starts with N.
The nato phonetic alphabet is still useful even if the other party doesn't know it, I've used it a bunch of times on the phone to spell out my 10- letter last name. Saves quite a lot of time and energy for me vs saying "letter as in word" for each letter.
Exactly. The listening party doesn't need to have knowledge of the NATO alphabet to still benefit from it since they are just regular English words.
I once had someone sound out a serial number over a spotty phone connection years ago and they said "N as in NAIL". You know what sounds a lot like NAIL? MAIL.
And that is why we don't just arbitrarily make up phonetic alphabets.
I dunno, there's a pretty good chance that the one that people spent time and effort designing to replace earlier efforts with the goal of reducing potential ambiguity and for use over noisy connections with expectation that mistakes could cost lives is probably better than what you improvise on the spot
When I worked in customer service, I asked a teammate what I could do to spell back something the customer said, and she taught me that system, it helped me a lot.
Then I turned it into an alias, called it "serveit" and tweeted about it. And now I see it as a bash script, made a little bit more robust in case python is not installed :)
The "scripts" I use the most that I am most happy with are a set of Vagrant tools that manage initialising different kinds of application environments with an apt cache on the host. Also .ssh/config includes to make it as easy as possible to work with them from VSCode.
I set this stuff up so long ago I sort of forgot that I did it at all; it's like a standard feature. I have to remember I did it.
I hope to see an operating system with these scripts as built-in, because they are so intuitive and helpful! Which OS will be the first to take this on?
No offense, but a lot of those scripts are pretty hacky. They may work for the user, but I would not use them without making sure to review them and adapt them to my workflow.
That's a fair point. I think the author intended the post to be a treasure trove of ideas for your own scripts, not as something to blindly include in your daily workflow.
I have a bunch, but one that I rarely see mentioned but use all the time is memo(1) (https://github.com/aktau/dotfiles/blob/master/bin/memo).
It memoizes the command passed to it.
Manually clearing it (for example if I know the underlying data has changed) and in-pipeline memoization (which includes the input in the hash of the lookup) are both supported. In-pipeline memoization lets me rapidly iterate on shell pipelines: the main goal is to minimize my development latency, but it also has positive effects on dependencies (avoiding redundant RPC calls). The classic way of doing this is storing something in temporary files, but I find that awkward, and it makes it harder than necessary to experiment with the expensive command itself. Both of those will run curl once. NOTE: Currently environment variables are not taken into account when hashing.
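The elided examples presumably looked something like this (the exact memo invocations and URLs are not from the comment, so treat it as an illustration of the idea only):

    # memoized: the second run returns instantly from the cache
    memo curl -s https://api.example.com/big-report | jq '.items[]'
    # the classic temp-file approach it replaces
    curl -s https://api.example.com/big-report > /tmp/report.json
    jq '.items[]' < /tmp/report.json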
You're gonna absolutely love up (https://github.com/akavel/up).
If you pipe curl's output to it, you'll get a live playground where you can finesse the rest of your pipeline.
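Something like this, where the URL is just a placeholder:

    curl -s https://api.example.com/data.json | up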
up(1) looks really cool, I think I'll add it to my toolbox.
It looks like up(1) and memo(1) have similar use cases (or goals). I'll give it a try to see if I can appreciate its ergonomics. I suspect memo(1) will remain my mainstay:
Another thing I built into memo(1) which I forgot to mention: automatic compression. memo(1) will use available (de)compressors (in order of preference: zstd, lz4, xz, gzip) to (de)compress stored contents. It's surprising how much disk space and IOPS can be saved this way due to redundancy.
I currently only have two memoized commands:
That's roughly a 10x compression ratio.
This is terrific! I curl to files and then pipe them, all the time. This will be a great help.
I wonder if we have gotten to the point where we can feed an LLM our bash history and it could suggest improvements to our workflow.
Caching some API call because it is expensive, and then using the cached data many months later because of a bash suggestion :(
The default storage location for memo(1) output is /tmp/memo/${USER}. Most distributions either have some automatic periodic cleanup, and/or wipe it on restart.
Separately from that:
In practice, this has never been a problem for me, and I've used this hacked-together command for years.
15 years of Linux and I learn something new all the time...
It's why I keep coming back. Now, how do I remember to use this and not go back to using tmpfiles :)
I have used the Warp terminal for a couple of years, and recently they embedded AI into it. At first I was irritated and disabled it, but the AI Agent is built in as an optional mode (Cmd-I to toggle). And I found myself using it more and more often for commands that I have no capacity or will to remember or dig through the man pages for (from "figure out my IP address on the wifi interface" to "make ffmpeg do this or that"). It's fast and can iterate on its own errors, and now I can't resist using it regularly. Removes the need for "tools to memorize commands" entirely.
> `curl ... | jq . | awk '...'`
Uhm, jq _is_ as powerful as awk (more, even). You can use jq directly and skip awk.
(I know, old habits die hard, and learning functional programming languages is not easy.)
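For instance, a pipeline like `curl ... | jq . | awk '...'` can usually be collapsed into a single jq filter; the URL and field names below are made up for illustration:

    curl -s https://api.example.com/users | jq -r '.[].name'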
I see no way to name the memos in your examples, so how do you refer to them later?
also, this seems a lot like an automated way to write shell scripts that you can pipe to and from. so why not use a shell script that won't surprise anyone instead of this, which might?
Dude, this is _awesome_. Thank you for sharing!
Glad you like it. Hope you get as much use of it as me.
> trash a.txt b.png moves `a.txt` and `b.png` to the trash. Supports macOS and Linux.
The way you’re doing it trashes files sequentially, meaning you hear the trashing sound once per file and ⌘Z in the Finder will only restore the last one. You can improve that (I did it for years) but consider just using the `trash` commands which ships with macOS. Doesn’t use the Finder, so no sound and no ⌘Z, but it’s fast, official, and still allows “Put Back”.
> jsonformat takes JSON at stdin and pretty-prints it to stdout.
Why prioritise node instead of jq? The latter is considerably less code and even comes preinstalled with macOS, now.
> uuid prints a v4 UUID. I use this about once a month.
Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?
https://www.man7.org/linux/man-pages/man1/uuidgen.1.html
I am not the author, but my bet is that he didn't know of its existence.
The best part about sharing your config or knowledge is that someone will always light up your blind spots.
> The best part about sharing your config or knowledge is that someone will always light up your blind spots.
Yes! I will take this as a chance to thank everyone who has shared their knowledge on the Internet. You guys are so freaking awesome! You are always appreciated.
A big chunk of my whole life's learning came from all the forums that I used to scour through, hour after hour! Because these awesome people were always sharing their knowledge, and someone else was always adding more. That's what made the Internet the Internet. And all of it is now almost on the brink of being lost, because of greedy corporations.
This habit also helped me with doom-scrolling. I sometimes do doomscroll, but I can catch it quickly and snap out of it. Because, my whole life, I always jumped into the rabbit holes and actually read those big blog posts, the ones where you had those `A-ha` moments: "Oohh, I can use that", "Ahh, that's clever!".
When browsing doesn't give me that, my brain actually triggers: "What are you doing?"
Later, I got lazy, which I am still paying for. But I am going to get out of it.
Never stop jumping into those rabbit holes!! Well, obviously, not every rabbit hole is a good one, but you'll probably come out wiser.
Or more abstractly: post anything to the internet and people will always detail how you’re wrong. Sometimes that can be useful.
That seems to be especially true on HN. Other forums have some of that as well, but on HN it seems nearly every single comment section is like 75% (random number) pointing out faults in the posted article.
Although I normally loathe pedantic assholes, I've found the ones on HN seem to be more tolerable because they typically know they'll have to back up what they're saying with facts (and ideally citations).
I've found that pedantic conversations here seem to actually have a greater potential for me to learn something from them than other forums/social platforms. On other platforms, I see someone providing a pedantic response and I'll just keep moving on, but on HN, I get curious to not only see who wins the nerd fight, but also that I might learn at least one thing along the way. I like that it's had an effect on how I engage with comment sections.
And the worst of it gets flagged and even dead-ed so most skip it after a bit, as I assumed would happen recently
https://news.ycombinator.com/item?id=45649771
Yes, flagging mechanism on HN is evil.
I have showdead on, and almost every single flagged post I've seen definitely deserves it. Every time it wasn't "deserved", the person simply took an overly aggressive tone for no real reason.
In short, I've never seen somebody flagged simply for having the wrong opinion. Even controversial opinions tend to stay unflagged, unless they're incredibly dangerous or unhinged.
I've seen a few dead posts where there was an innocent misunderstanding or wrong assumption. In those cases it would have been beneficial to keep the post visible and post a response, so that readers with similarly mistaken assumptions could have seen a correction. Small minority of dead posts though. They can be vouched for actually but of course this is unlikely to happen.
I agree that most dead posts would be a distraction and good to have been kept out.
It’s a blunt tool, but quite useful for posts. I read most dead posts I come across and I don’t think I ever saw one that was not obviously in violation of several guidelines.
OTOH I don’t like flagging stories because good ones get buried regularly. But then HN is not a great place for peaceful, nuanced discussion and these threads often descend into mindless flame wars, which would bury the stories even without flagging.
So, meh. I think flagging is a moderately good thing overall but it really lacks in subtlety.
> I've found the ones on HN seem to be more tolerable because they typically know they'll have to back up what they're saying with facts (and ideally citations).
Can you back this up with data? ;-)
I see citations and links to sources about as little as on reddit around here.
The difference I see is in the top 1% comments, which exist in the first place, and are better on average (but that depends on what other forums or subreddits you compare it to, /r/AskHistorians is pretty good for serious history answers for example), but not in the rest of the comments. Also, less distractions, more staying on topic, the joke replies are punished more often and are less frequent.
I find that endearing for two reasons:
- either critique is solid and I learn something
- or commenter is clueless which makes it entertaining
there is very seldom a “middle”
Yea I don't particularly mind it, just an interesting thing about HN compared to many other forums.
*fora
True true, one of my favorite things is watching the shorts on home improvement or 'hacks', and sure enough there are always multiple comments saying why it won't work and why it's not the right way. Just as entertaining as the video.
This is, incidentally, codified as Cunningham's Law.
https://meta.wikimedia.org/wiki/Cunningham%27s_Law
...aaand less directly (though referenced in the wikipedia article)...
https://xkcd.com/386/
Exactly! I didn’t know macOS ships JQ or the uuidgen tool. Very cool
Also possible (even though I've seen the author's response about not knowing) is that the scripts were written before the native tool was included. At that point, the muscle memory is just there. I know I have a few scripts like that myself.
> Why prioritise node instead of jq? The latter is considerably less code and even comes preinstalled with macOS, now.
That was my thought. I use jq to pretty print json.
What I have found useful is j2p and p2j to convert to/from python dict format to json format (and pretty print the output). I also have j2p_clip and p2j_clip, which read from and then write to the system clipboard so I don't have to manually pipe in and out.
> Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?
I also made a uuid, which just runs uuidgen, but then trims the \n. (And maybe copied to clipboard? It was at my old job, and I don't seem to have saved it to my personal computer.)
Python also pretty-prints out of the box:
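If that's what was meant, it's likely the standard-library json.tool module, which works as a filter (the URL is just a placeholder):

    curl -s https://api.example.com/users | python3 -m json.tool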
Other examples where native features are better than these self-made scripts...
> vim [...] I select a region and then run :'<,'>!markdownquote
Just select the first column with ctrl-v, then "I> " then escape. That's 4 keys after the selection, instead of 20.
> u+ 2025 returns ñ, LATIN SMALL LETTER N WITH TILDE
`unicode` is widely available, has a good default search, and many options. BTW, I wonder why "2025" matched "ñ".
> catbin foo is basically cat "$(which foo)"
Since the author is using zsh, `cat =foo` is shorter and more powerful. It's also much less error-prone with long commands, since zsh can smartly complete after =.
I use it often, e.g. `file =firefox` or `vim =myscript.sh`.
> `unicode` is widely available
It's not installed by default on macOS or Ubuntu, for me.
You are right but
and it did. So it really was available. That's Debian 11.
Shoutout to rip as an alternative to rm and trash:
https://github.com/nivekuil/rip
`trash` is good to know, thanks! I'd been doing: "tell app \"Finder\" to move {%s} to trash" where %s is a comma separated list of "the POSIX file <path-to-file>".
Oooh, I just suggested in another comment that using applescript would be possible. I didn't think it would be this easy though.
I believe it would be possible to execute an applescript to tell the finder to delete the files in one go. It would theoretically be possible to construct/run the applescript directly in a shell script. It would be easier (but still not trivial) to write an applescript file to take the file list as an argument to then delete when calling from the shell.
It’s not theoretical, and it is trivial. Like I said, I did exactly that for years. Specifically, I had a function in my `.zshrc` to expand all inputs to their full paths, verify and exclude invalid arguments, trash the rest in one swoop, then show me an error with the invalid arguments, if any.
Trash command first appeared in macOS 14.0, which was 2023.
I do `mv a.txt /tmp` instead of `rm`.
> Why prioritise node instead of jq?
In powershell I just do
But as a function:
And it's `New-Guid` in PowerShell.
Instead of trash, reimplementing rm (to only really delete after some time, or depending on resource usage, or to shred if you are paranoid and the goal is to really delete something) or using zfs makes much more sense.
I can't imagine a scenario where I would want to reimplement rm just for this.
[flagged]
https://news.ycombinator.com/newsguidelines.html
> Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.
Instead of being rude to a fellow human making an inoffensive remark, you could’ve spent your words being kind and describing the scenario you claim exists. For all you know, maybe they did ask ChatGPT and were unconvinced by the answer.
As a side note, I don’t even understand how your swipe would make sense. If anything, needing ChatGPT is what demonstrates a lack of imagination (having the latter you don’t need the former).
What makes you think I need ChatGPT? I just wondered whether ChatGPT was as stupid, since obviously I do know why that would be useful.
How is this better?
For trash on macOS, I recommend https://github.com/ali-rantakari/trash
Does all the right things and works great.
There’s a similar tool that works well on Linux/BSDs that I’ve used for years, but I don’t have my FreeBSD desktop handy to check.
The trash command for macOS that's being talked about above is native in the OS now, since v14 according to its manpage, though I see it may have really been v15[1]
1: https://mjtsai.com/blog/2025/08/26/the-trash-command/
I've written on this before, but I have an extensive collection of "at" scripts. This started 25+ years ago when I dragged a PC tower running BSD to a friend's house, and their network differed from mine. So I wrote an @friend script which did a bunch of ifconfig foo.
Over time that's grown to an @foo script for every project I work on, every place I frequent that has some kind of specific setup. They are prefixed with an @ because that only rarely conflicts with anything, and tab-complete helps me remember the less frequently used ones.
The @project scripts setup the whole environment, alias the appropriate build tools and versions of those tools, prepare the correct IDE config if needed, drop me in the project's directory, etc. Some start a VPN connection because some of my clients only have git access over VPN etc.
Because I've worked on many things over many years, most of these scripts also output some "help" output so I can remember how shit works for a given project.
Here's an example:
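A hypothetical sketch of what such an @project helper could look like (none of the paths, tools, or the project name are from the comment; a real one would be sourced from your shell config so the cd and aliases stick):

    @acme() {
        echo "acme: build with 'mvn package', deploy with 'make deploy'"   # the self-help output
        export ANSIBLE_INVENTORY=~/projects/acme/inventory                 # project-specific ansible inventory
        alias mvn='mvn -Dmaven.repo.local=$HOME/projects/acme/.m2'         # per-project .m2 cache
        cd ~/projects/acme || return
    }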
Edit: a word on aliases: I frequently alias tools like maven or ansible to include config files that are specific to that project. That way I can have a .m2 folder for every project that doesn't get polluted by other projects, I don't have to remember to tell ansible which inventory file to use, etc. I'm lazy and my memory is for shit.
Slightly related, but mise, a tool you can use instead of e.g. make, has "on enter directory" hooks that can reconfigure your system quite a bit whenever you enter the project directory in the terminal. Initially I was horrified by this idea, but I have to admit it's been quite nice to enter a directory and have everything set up just right, also for new people joining. It has built-in version management of just about every command line tool you could imagine, so that an entire team can be on a consistent setup of Python, Node, Go, etc.
I see other people mentioning env and mise does this too, with additional support to add on extra env overrides with a dedicated file such as for example .mise.testing.toml config and running something like:
MISE_ENV=testing bun run test
(“testing” in this example can be whatever you like)
This is very useful to me and I had no idea, thanks for pointing that feature out!
I'm stealing the top comment here because you probably know what I'm asking.
I've always wanted a Linux directory hook that runs some action. Say I have a scripts dir filled with 10 different shell scripts. I could easily have a readme or something to remember what they all do.
What I want is some hook in a dir that every time I cd into that dir it runs the hook. Most of the time it would be a simple 'cat usage.txt' but sometimes it maybe 'source .venv/bin/activate'.
I know I can alias the cd and the hook together, but I don't want that.
I recommend direnv for that: https://direnv.net/
Its intended use case is loading environment variables (you could use this to load your virtualenv), but it works by sourcing a script — and that script can be ‘cat usage.txt.’
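A tiny .envrc along those lines (a sketch, and note direnv requires a one-time `direnv allow` per directory):

    # .envrc — run by direnv every time you cd into this directory
    cat usage.txt                 # direnv surfaces the output, so this works as a reminder
    source .venv/bin/activate     # or use direnv's built-in `layout python`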
Great tool.
If you use Emacs (and you should!), there’s a direnv mode. Emacs also has its own way to set configuration items within a directory (directory-local variables), and is smart enough to support two files, so that there can be one file checked into source control for all members of a project and another ignored for one’s personal config.
direnv does exactly what you describe (and a lot more) using flake.nix. cd into the directory and it automatically runs. I use it in every single project/repository to set environment variables and install project-specific dependencies locked to specific versions.
> direnv does exactly what you describe (and a lot more) using flake.nix
Direnv is awesome! Note, though, that it does not depend on Nix, just a Unix-like OS and a supported shell: https://direnv.net/#prerequisites
As other comments say, direnv does that, but honestly you should look into mise-en-place (mise) which is really great, and also includes a "mini-direnv"
This is exactly the kind of stuff I'm most interested in finding on HN. How do other developers work, and how can I get better at my work from it?
What's always interesting to me is how many of these I'll see and initially think, "I don't really need that." Because I'm well aware of the effect (which I'm sure has a name - I suppose it's similar to induced demand) of "make $uncommon_task much cheaper" -> "$uncommon_task becomes the basis of an entirely new workflow/skill". So I'm going to try out most of them and see what sticks!
Also: really love the style of the post. It's very clear but also includes super valuable information about how often the author actually uses each script, to get a sense ahead of time for which ones are more likely to trigger the effect described above.
A final aside about my own workflows which betrays my origins... for some of these operations, and for others I occasionally need, I'll just open a browser dev tools window and use JS to do it, for example lowercasing a string :)
This is one of the things I miss the most about hacker conferences. The sharing of tools, scripts, tips and tricks. It was, and still is, just as fun as trading cards.
I'd love to see a cost benefit analysis of the author's approach vs yours, which includes the time it took the author to create the scripts, remember/learn to use them/reference them when forgetting syntax, plus time spent migrating whenever changing systems.
Not all time is created equal. I’ll happily invest more time than I’ll ever get back in refining a script or vim config or whatever, so that later, when I’m busy and don’t have time to muck around, I can stay in the flow and not be annoyed by distractions.
Sometimes it's a matter of sanity rather than time management. I once created a systemd service which goes to a company web page and downloads some files which I sometimes need. This script was pretty hacky, and writing it took me a lot of time - probably more than clicking manually through this page would have taken in the long run. But the clicking was so annoying that I feel it was totally worth it.
> reference them when forgetting syntax
If you have to do that, the script needs improvement. Always add a `--help` which explains what it does and what arguments it takes.
If you write these sorts of things in Python, argparse is worth investigating: https://docs.python.org/3/library/argparse.html - it's pretty easy to use, makes it easy to separate the command line handling from the rest of the code, and, importantly, will generate a --help page for you. And if you want something it can't do, you can still always write the code yourself!
I don’t like Python in general, but even so I’ll say that argparse is indeed very nice. When I was writing ruby, I always felt that OptionParser¹ wasn’t as good. Swift has Argument Parser², officially from Apple, which is quite featureful. For shell, I have a couple of bespoke patterns I have been reusing in every script for many years.
¹ https://github.com/ruby/optparse
² https://github.com/apple/swift-argument-parser
Why is this interesting to you? The whole point of doing all of this is to be more efficient in the long run. Of course there is an initial setup cost and learning curve, after which you will hopefully feel quite efficient with your development environment. You are making it sound like it is not worth the effort because you have to potentially spend time learning "it"? I do not believe that it takes long to learn, but of course it can differ a lot from person to person. Your remarks seem like non-issues to me.
It's interesting because there's a significant chance one wastes more time tinkering around with custom scripts than saving in the long run. See https://xkcd.com/1205/
For example. The "saves 5 seconds task that I do once a month" from the blog post. Hopefully the author did not spend more than 5 minutes writing said script and maintaining it, or they're losing time in the long run.
Maybe, but
1. even if it costs more time, it could also save more annoyance which could be a benefit
2. by publishing the scripts, anyone else who comes across them can use them and save time without the initial cost. similarly, making and sharing these can encourage others to share their own scripts, some of which the author could save time with
Sometimes, you explore to have ideas. By fixing a few problems like these, you learn about technologies that can help you in another situation.
Not all time is created equally though, so I disagree with that xkcd.
If something is time sensitive it is worth spending a disproportionate amount of time to speed things up at some later time. For example if you’re debugging something live, in a live presentation, working on something with a tight deadline etc.
Also you don’t necessarily know how often you’ll do something anyways.
> I disagree with that xkcd
The xkcd doesn't seem to be pushing an agenda, just providing a lookup table. Time spent vs time saved is factual.
One thing which is often ignored in these discussions is the experience you gain. The time you “wasted” on your previous scripts by taking longer to write them compounds in time saved in the future because you can now write more complex tasks faster.
The problem is, to really internalize that benefit, one would need to have an open mind to trying things out, and many folks seem to resist that. Oh well, more brain connections for me I suppose.
I find that now with AI, you can make scripts very quickly, reducing the time to write them by a lot. There is still some time needed for prompting and testing but still.
https://raw.githubusercontent.com/fliptheweb/bash-shortcuts-... has served me very well.
Wow thanks, I'm tattooing this on my right hand now. :)
It's weird how the circle of life progresses for a developer or whatever.
- When I was a fresh engineer I used a pretty vanilla shell environment
- When I got a year or two of experience, I wrote tons of scripts and bash aliases and had a 1k+ line .bashrc the same as OP
- Now, as a more tenured engineer (15 years of experience), I basically just want a vanilla shell with zero distractions, aliases or scripts and use native UNIX implementations. If it's more complicated than that, I'll code it in Python or Go.
I think it's more likely to say that this comes from a place of laziness than some enlightened peak. (I say this as someone who does the same, and is lazy).
When I watch the work of coworkers or friends who have gone these rabbit holes of customization I always learn some interesting new tools to use - lately I've added atuin, fzf, and a few others to my linux install
Atuin is new to me!
https://github.com/atuinsh/atuin
Discussed 4 months ago:
Atuin – Magical Shell History https://news.ycombinator.com/item?id=44364186 - June 2025, 71 comments
I gave it a try a few months ago, but it did not work for me. My main issue was that atuin broke my workflow with fzf (if I remember correctly, pressing Ctrl-R to look up my shell history did not work well after installing atuin).
This is configurable! I use atuin, but fzf with ctrl-r.
I'm sympathetic, also a longtime fzf user here. I install it reflexively on any system I use for more than a day or two.
I like atuin but why is it so slow when first opening (hitting up) in the shell?
I'd recommend disabling atuin when hitting up and just leave it on ctrl+r instead
Either it wasn't a design goal or they are stupid. Why don't you tell us?
The right way for this to work would be via a systemd service, and then it should be instant.
I went through a similar cycle. Going back to simplicity wasn't about laziness for me, it was because i started working across a bunch more systems and didn't want to do my whole custom setup on all of them, especially ephemeral stuff like containers allocated on a cluster for a single job. So rather than using my fancy setup sometimes and fumbling through the defaults at other times, i just got used to operating more efficiently with the defaults.
You can apply your dotfiles to servers you SSH into rather easily. I'm not sure what your workflow is like but frameworks like zsh4humans have this built in, and there are tools like sshrc that handle it as well. Just automate the sync on SSH connection. This also applies to containers if you ssh into them.
I'm guessing you haven't worked in Someone Else's environment?
The amount of shit you'll get for "applying your dotfiles" on a client machine or a production server is going to be legendary.
Same with containers, please don't install random dotfiles inside them. The whole point of a container is to be predictable.
Do you have experience with these tools? Some such as sshrc only apply temporarily per session and don't persist or affect other users. I keep plain 'ssh' separate from shell functions that apply dotfiles and use each where appropriate. You can also set up temporary application yourself pretty easily.
In other replies you explicitly state how rare it is that you log in to other systems.
Aren't you therefore optimizing for 1% of the cases, but sabotaging the 99%?
Someone else's environment? That should never happen. You should get your own user account and that's it.
Sometimes we need to use service accounts, so while you do have your own account, all the interesting things happen in svc_foo, to which you cannot add your dotfiles.
I don’t even get an account on someone else’s server. There’s no need for me to log in anywhere unless it’s an exceptional situation.
This doesn't make sense.
You said you were already using someone else's environment.
You can't later say that you don't.
Whether or not shell access makes sense depends on what you are doing, but a well written application server running in a cloud environment doesn't need any remote shell account.
It's just that approximately zero typical monolithic web applications meet that level of quality, and given that 90% of "developers" are clueless, they can often convince management that being stupid is OK.
They do get to work on someone else's server; they do not get a separate account on that server. The client would not be happy to have them mess around with the environment.
By definition, if the client Alice gives contractor Mallory access to user account alice, that's worse than giving them an account called mallory.
Accounts are basically free. Not having accounts; that's expensive.
If, in the year 2025, you are still using a shared account called "root" (password: "password"), and it's not a hardware switch or something (and even they support user accounts these days), I'm sorry, but you need to do better. If you're the vendor, you need to do better, if you're the client, you need to make it an issue with the vendor and tell them they need to do better. I know, it's easy for me to say from the safety of my armchair at 127.0.0.1. I've got some friends in IT doing support that have some truly horrifying stories. But holy shit why does some stuff suck so fucking much still. Sorry, I'm not mad at you or calling you names, it's the state of the industry. If there were more pushback on broken busted ass shit where this would be a problem, I could sleep better at night, knowing that there's somebody else that isn't being tortured.
It’s 2025. I don’t even have the login password to any server, they’re not unicorns, they’re cattle.
If something is wrong with a server, we terminate it and spin up a new one. No need for anyone to log in.
In very rare cases it might be relevant to log in to a running server, but I haven’t done that in years.
The defaults are unbearable. I prefer using chezmoi to feel at home anywhere. There's no reason I can't at least have my aliases.
I'd rather take the pain of writing scripts to automate this for multiple environments than suffer the death by a thousand cuts which are the defaults.
chezmoi is the right direction, but I don't want to have to install something on the other server, I should just be able to ssh to a new place and have everything already set up, via LocalCommand and Host * in my ~/.ssh/config
When I had one nix computer, I wanted to customize it heavily.
Now I have many nix computers and I want them consistent and with only the most necessary packages installed.
For anyone else reading this comment who was confused because this seems like the opposite of what you'd expect about Nix: Hacker News ate the asterisks and turned them into italics.
use a backslash. \*
(had to use a double backslash to render that correctly)
Or two consecutive asterisks: ** becomes *
Besides many nix computers I also have a wife, a dog, children, chores, and shopping to be done. Unlike when I was a young engineer, I can no longer stay up all night fiddling with bash scripts and environments.
What does your wife, dog, children, chores, and shopping have to do with custom configuration and scripts? Just set up a Git repo online, put your files there, and take a couple of minutes to improve it incrementally when you encounter inconveniences. And just like that, you made your life easier for a marginal effort.
They compete for time.
Don't even try to explain the scripts to wife*, try the dog. At least he'll understand it just as much and be enthusiastic to hear it!
*may not be applicable to all wives, ymmv.
I taught my wife LaTeX, she loves me for it :D
I'm saying that makes no sense, as I wrote in the comment you're replying to.
I don't get why this is a problem. Just stick all your configs in a git repo and clone it wherever you need it.
I would still call my Python scripts “scripts.” I don’t think the term “scripts” is limited to shell scripts.
Yeah - been there, done that, too. I feel like the time I gain from having a shortcut is often less than what I would need to maintain it or to remember the real syntax when I'm on a machine where it's not available (which happens quite often in my case). I try to go with system defaults as much as possible nowadays.
I can't say I relate at all (5 years of experience). They'll have to pry my 1000-line .zshrc from my cold, dead hands. For example, zsh-autosuggestions improves my quality of life so ridiculously much it's not even funny.
I moved away from a 1000-line .zshrc when I had to do stuff on Linux VMs/Docker containers and I was lost a lot. But zsh-autosuggestions and fzf-tab are not going anywhere.
I am going through a phase of working with younger engineers who have many dotfiles, and I just think "Oh, yeh, I remember having lots of dotfiles. What a hassle that was."
Nowadays I just try to be quite selective with my tooling and learn to change with it - "like water", so to speak.
(I say this with no shade to those who like maintaining their dotfiles - it takes all sorts :))
I've been programming 30 years and I really don't find it a hassle:
- if you commit them to git, they last your entire career
- improving your setup is basically compound interest
- with a new laptop, my setup script might cause me 15 minutes of fixing a few things
- the more you do it, the less any individual hassle becomes, and the easier it looks to make changes – no more "i don't have time" mindset
> When I was a fresh engineer I used a pretty vanilla shell environment. When I got a year or two of experience, I wrote tons of scripts
Does this mean that you learned to code to earn a paycheck? I'm asking because I had written hundreds of scripts and Emacs Lisp functions to optimize my PC before I got my first job.
This is how it works for you.
As a person who loves their computer, my ~/bin is full. I definitely (not that you said this) do not think "everything I do has to be possible on every computer I am ever shelled into".
Being a person on a computer for decades, I have tuned how I want to do things that are incredibly common for me.
Though perhaps you're referring to work and not hobby/life.
Prepare to swing back again. With nearly 30 years experience I find the shell to be the best integration point for so many things due to its ability to adapt to whatever is needed and its universal availability. My use of a vanilla shell has been reduced to scripting cases only.
On the other hand, the author seems to have a lot of experience as well.
Personally I tend to agree... there is a very small subset of things I find worth aliasing. I have a very small amount and probably only use half of them regularly. Frankly I wonder how my use case is so different.
edit: In the case of the author, I guess he probably wants to live in the terminal full time. And perhaps offline. There is a lot of static data he's stored, like HTTP status codes: https://codeberg.org/EvanHahn/dotfiles/src/commit/843b9ee13d...
In my case I'd start typing it in my browser and then just click something I've visited 100 times before. There is something to be said about reducing that redundant network call, but I don't think it makes much practical difference, and the mental mapping/discoverability of aliases isn't nothing.
For the infra engineers out there who still manage fleets of pets, this is doubly true. You may not have access or be able to use all your shortcut scripts, so you had better know the raw commands on that unsupported RHEL6 host.
I use a dotfile with aliases and functions, mostly to document / remember commands I find useful. It's been a handy way to build a living document of the utils I use regularly, and is easy to migrate to each new workstation.
Man, I couldn't live without alias ..='cd ..' or alias ...='cd ../..'
To this day, I still get tripped up when using a shell for the first time without those, as they're muscle memory now.
I just use the autocd zsh shell option for this. And I also use `hash -d` to define shortcuts for common directories. Then just “executing” something like `~gh/apache/kafka` will cd to the right place.
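For reference, that setup is roughly the following (zsh; the github path is just an example):

    setopt autocd                    # typing a directory path on its own cd's into it
    hash -d gh=~/src/github.com      # named directory: ~gh now expands to ~/src/github.com
    # so entering `~gh/apache/kafka` at the prompt cd's straight there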
Thanks. I haven't considered these aliases, but they seam useful, so I just added them for my user. :-)
You can configure Alt+Left to go up a level.
Given the nature of current operating systems and applications, do you think the idea of “one tool doing one job well” has been abandoned? If so, do you think a return to this model would help bring some innovation back to software development?
Rob Pike: Those days are dead and gone and the eulogy was delivered by Perl.
But was the eulogy written in Perl poetry? I see it everywhere, but I don't know who this JAPH guy is. It's a strange way of spelling Jeff, and it's odd that he types his name in all caps, but he has published a remarkable quantity of works and he's even more famous than the anonymous hacker known as 4chan.
Oh, I hate that paradigm. Well, maybe chmod and ls and rsync and curl all do their OWN thing very well, but every time I am using one of those tools I have to remember whether, e.g., a more detailed response is -v or maybe -vvv or --verbose or -x for some reason, because a maintainer felt like it at 2:32 in the morning 17 years ago... Some consistency would help, but it is probably impossible; the flame war over -R being recursive or read-only would never end.
I've heard this often, but I'm going on ~25 years of using Linux, and I would be lost without my dotfiles. They represent years of carefully crafting my environment to suit my preferences, and without them it would be like working on someone else's machine. Not impossible, just very cumbersome.
Admittedly, I've toned down the configs of some programs, as my usage of them has evolved or diminished, but many are still highly tailored to my preferences. For example, you can't really use Emacs without a considerable amount of tweaking. I mean, you technically could, but such programs are a blank slate made to be configured (and Emacs is awful OOB...). Similarly for zsh, which is my main shell, although I keep bash more vanilla. Practically the entire command-line environment and the choices you make about which programs to use can be considered configuration. If you use NixOS or Guix, then that extends to the entire system.
If you're willing to allow someone else to tell you how you should use your computer, then you might as well use macOS or Windows. :)
I prefer using kubectl over any other method, so I have plenty of functions to help with that. I'd never consider using python or go for this, although I do have plenty of python and go "scripts" on my path too.
It's the bell curve meme all along.
Different strokes for different folks - tenured engineers just settle into whatever works best for them.
The moment of true enlightenment is when you finally decide to once and for all memorize all the arguments and their order for those command line utilities that you use at an interval that's just at the edge of your memory: xargs, find, curl, rsync, etc.
That, plus knowing how to parse a man file to actually understand how to use a command (a skill that takes years to master) pretty much removes the need for most aliases and scripts.
Why would I even attempt to do that? Life is too short to try to remember something like that. Maybe 20 years ago, when internet access was not that common. Or maybe if you are a hacker, hacking other people's machines. Me? Just some dev trying to make some money to feed my family? I prefer to take a walk in the woods.
I already have limited space for long term memory, bash commands are very far down the list of things I'd want to append to my long term storage.
I use ctrl-R with a fuzzy matching program, and let my terminal remember it for me.
And before it's asked: yes that means I'd have more trouble working in a different/someone else's environment. But as it barely ever happens for me, it's hardly an important enough scenario to optimize for.
If you come through the other side, you set up LocalCommand in your .ssh/config which copies your config to every server you ssh to, and get your setup everywhere.
or just ask claude etc to do it for ya
Regarding the `line` script, just a note that sed can print an arbitrary line from a file, no need to invoke a pipeline of cat, head, and tail:
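(The sed invocations elided here were presumably along these lines:)

    sed -n 2p file
    sed -n 2,4p file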
The first prints the second line of the file. The advantage sed has over this line script is that it can also print more than one line, should you need to: the second prints lines 2 through 4, inclusive.
It is often useful to chain multiple sed commands and sometimes shuffle them around. In those cases I would need to keep changing the first sed. Sometimes I need to grep before I sed. Using cat, tail and head makes things more modular in the long run, I feel. It's the ethos of each command doing one small thing.
yeah I almost always start with `cat` but I still pipe it into `sed -n 1,4p`
True, everything depends on what one is trying to do at the time.
If you'd like to print the middle of a file, try out `body`: https://github.com/megamansec/body
While you're creating and testing aliases, it's handy to source your ~/.zshrc whenever you edit it:
I alias mdfind to grep my .docx files on my Mac:
I use an `anon` function to anonymize my Mac clipboard when I want to paste something to the public ChatGPT, company Slack, private notes, etc. I ran it through itself before pasting it here, for example. It prints the new clipboard to stdout so you can inspect what you'll be pasting for anything it missed.
ha! alias vz="vim ~/.zshrc && . ~/.zshrc" is brilliant. Editing zshrc and sourcing is something I do pretty often. Never thought to alias it.
What's the difference between 'source' and '.' ?
I think they're the same except '.' is POSIX and 'source' is specific to bash and compatible shells. I personally just use source since it's easier to read and zsh and bash account for basically 100% of my shell usage.
nothing afaict
brilliant! this happens all the time and I never found a convenient way to manage
I use these two all the time to encode and cut mp4s.
The flags are for maximum compatibility (e.g. without them, some MP4s don't play in WhatsApp, or Discord on mobile, or whatever.)
ffmp4 foo.webm -> foo_sd.mp4
fftime foo.mp4 01:30 01:45 -> foo_cut.mp4
Note, fftime copies the audio and video data without re-encoding, which can be a little janky, but often works fine, and can be much (100x) faster on large files. To re-encode just remove "-c copy"
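A rough reconstruction of the two helpers described (the exact flags are the commenter's own, so treat this as a sketch):

    ffmp4() { ffmpeg -i "$1" -c:v libx264 -pix_fmt yuv420p -movflags +faststart -c:a aac "${1%.*}_sd.mp4"; }
    fftime() { ffmpeg -ss "$2" -to "$3" -i "$1" -c copy "${1%.*}_cut.mp4"; }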
Historical note: getting hold of these scripts by chatting to various developers was the motivation for the original 2004 "lifehacks" talk[1][2]. If you ever get into an online argument over what is a "life hack" and what isn't, feel free to use short scripts like these as the canonical example.
Otherwise, I am happy to be pulled into your discussion, Marshall McLuhan style[3] to adjudicate, for a very reasonable fee.
[1] https://craphound.com/lifehacksetcon04.txt
[2] https://archive.org/details/Notcon2004DannyOBrienLifehacks
[3] https://www.openculture.com/2017/05/woody-allen-gets-marshal...
I'm kicking myself for not thinking of the `nato` script.
I tend to try to not get too used to custom "helper" scripts because I become incapacitated when working in other systems. Nevertheless, I really appreciate all these scripts if nothing else than to see what patterns other programmers pick up.
My only addition is a small `tplate` script that creates HTML, C, C++, Makefile, etc. "template" files to start a project. Kind of like a "wizard setup". e.g.
And of course, three scripts `:q`, `:w` and `:wq` that get used surprisingly often:
The one I use the most is "cdn". It cds to the newest subdirectory.
So if you're in your projects folder and want to keep working on your latest project, I just type "cdn" to go there.
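One possible implementation, assuming at least one subdirectory and no exotic names (the real script may differ):

    cdn() { cd -- "$(ls -td -- */ | head -n 1)"; }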
My fav script to unpack anything, found a few years ago somewhere
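Scripts like that usually boil down to a case statement over the file extension; a sketch of the general shape, not the commenter's exact script:

    extract() {
      case "$1" in
        *.tar.gz|*.tgz)   tar xzf "$1" ;;
        *.tar.bz2|*.tbz2) tar xjf "$1" ;;
        *.tar.xz)         tar xJf "$1" ;;
        *.zip)            unzip "$1" ;;
        *.rar)            unrar x "$1" ;;
        *.7z)             7z x "$1" ;;
        *.gz)             gunzip "$1" ;;
        *)                echo "don't know how to extract '$1'" >&2; return 1 ;;
      esac
    }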
`tar xf` autodetects compressed files now. You can replace any of your instances of tar with that.
Honestly, it doesn't need any updates, it works so great without any pain, I'm just happy with it
Yes, but only bsdtar has support for zip, rar, and 7z.
I use dtrx, which also ensures that all files are extracted into a folder.
That's brilliant. Now I need its compressing counterpart.
For compression, I have one for .tar.gz. But it's not that popular in my system. I need something a bit easier than 'pack file file file archive.tar.gz'
`aunpack` does the trick for me.
Very nice and clean
Now, add inotify and a systemd user service and you would be getting somewhere. Also packaged versions of that exist already.
So, you created a square wheel, instead of a NASA wheel.
One script I use quite often:
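Presumably something along these lines (GNU date shown; on macOS/BSD the second branch would use `date -r "$1"`, and the name ts is a stand-in):

    ts() { if [ $# -eq 0 ]; then date +%s; else date -d "@$1"; fi; }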
Prints the current date as a UNIX timestamp. If you provide a UNIX timestamp as an arg, it prints the arg as a human-readable date.
Some cool things here, but in general I like to learn and use the standard utilities for most of this. The main reason is I hop in and out of a lot of different systems and my personal aliases and scripts are not on most of them.
sed, awk, grep, and xargs along with standard utilities get you a long long way.
Same. I interact with too many machines, many of which are ephemeral and will have been reprovisioned the next time I have to interact with it.
I value out-of-the-box stuff that works most everywhere. I have a fairly lightweight zsh config I use locally, but it's mostly just stuff like a status line that suits me, better history settings, etc. Stuff I won't miss if it's not there.
I totally agree with this, I end up working on many systems, and very few of them have all my creature comforts. At the same time, really good tools can stick around and become impactful enough to ship by default, or to be easily apt-get-able. I don't think a personal collection of scripts is the way, but maybe a well maintained package.
The Gen AI tooling is exceptionally good at doing these sorts of things, and way more than just "mkdir $1 && cd $1". For example:
I have used it to build an "escmd" tool for interacting with Elasticsearch. It makes the available commands much more discoverable, the output it formats in tables, and gets rid of sending JSON to a curl command.
A variety of small tools that interact with Jira (list my tickets, show tickets that are tagged as needing ops interaction in the current release).
A tool to interact with our docker registry to list available tags and to modify tags, including colorizing them based on the sha hash of the image so it's obvious which ones are the same. We manage docker container deploys based on tags so if we "cptag stg prod" on a project, that releases the staging artifact to production, but we also tag them by build date and git commit hash, so we're often working with 5-7 tags.
Script to send a "Software has successfully been released" message via gmail from the command-line.
A program to "waituntil" a certain time to run a command: "waituntil 20:00 && run_release", with nice display of a countdown.
I have a problem with working on too many things at once and then committing unrelated things tagged with a particular Jira case. So I had it write me a commit program that lists my tickets, shows the changed files, and lets me select which ones go with that ticket.
All these are things I could have built before, but would have taken me hours each. With the GenAI, they take 5-15 minutes of my attention to build something like this. And Gen AI seems really, really great at building these small, independent tools.
> ocr my_image.png extracts text from an image and prints it to stdout. It only works on macOS
The Mac Shortcut at https://github.com/e-kotov/macos-shortcuts lets you select a particular area of the screen (as with Cmd-Shift-4) and copies the text out of that, allowing you to copy exactly the text you need from anywhere on your screen with one keyboard shortcut. Great for popups with unselectable text, and copying error messages from coworkers' screenshares.
I have a Linux equivalent that uses maim to select a region and then tesseract to do the OCR.
I keep meaning to generalize this (directory target, multiple sources, flags), but I get quite a bit of mileage out of this `unmv` script even as it is:
I find that I like working with the directory stack and having a shortened version of the directory stack in the title bar, e.g. by modifying the stock Debian .bashrc
The script shorten_ds.pl takes the full directory stack and shortens it. That, coupled with functions that take 'u 2' as shorthand for 'pushd +2' and 'o 2' for 'popd +2', makes for easy manipulation of the directory stack.
I have mkcd exactly (I wonder how many of us do, it's so obvious).
I have almost the same, but differently named with scratch(day), copy(xc), markdown quote(blockquote), murder, waitfor, tryna, etc.
I used to use telegram-send with a custom notification sound a lot for notifications from long-running scripts if I walked away from the laptop.
I used to have one called timespeak that would speak the time to me every hour or half hour.
I have go_clone that clones a repo into GOPATH which I use for organising even non-go projects long after putting go projects in GOPATH stopped being needed.
I liked writing one-offs, and I don't think it's premature optimization because I kept getting faster at it.
Obviously that script is more convenient, but if you’re on a system where you don’t have it, you can do the following instead:
I too have a `mkcd` in my .zshrc, but I implemented it slightly differently:
>I have mkcd exactly ( I wonder how many of us do, it's so obvious)
Mine is called "md" and it has "-p" on the mkdir. "mkdir -p $1 && cd $1"
Doesn’t the built in `take` do exactly what `mkcd` does? Or is `take` a zsh/macos specific thing?
Edit: looks like it’s a zsh thing
it's an .oh-my-zsh thing (~/.oh-my-zsh/lib/functions.zsh) but thanks, I didn't know about it.
One more from me:
I have three different ways to open a file with vim:
v: vim (or neovim, in my case)
vv: search/preview and open a file by filename
vvv: search/preview and open a file by its content
A few I use are:
and I also use the comma-command pattern where I prefix my personal scripts with a `,` which allows me to cycle between them fast etc.
One thing I have found that's worth it is periodically running an aggregation on one's history and purging old ones that I don't use.
With `xsel --clipboard` (put that in an alias like `clip`), you can use the same thing to replace both `copy` and `pasta`:
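For example (xsel picks input or output mode based on whether stdin is a terminal; the alias name is just a suggestion):

    alias clip='xsel --clipboard'
    some_command | clip     # copy stdout to the clipboard
    clip | wc -l            # paste the clipboard into a pipeline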
Nice! Tangentially related: I built a (MacOS only) tool called clippy to be a much better pbcopy. It was just added to homebrew core. Among other things, it auto-detects when you want files as references so they paste into GUI apps as uploads, not bytes.
https://github.com/neilberkman/clippy / brew install clippy
Adding the word "then" to your first comment would have helped me: (lacking context, I thought the comments explained what the command does, as is common convention)
Also:
Updated, I appreciate it!
Awesome. Gonna check this out.
This is really interesting, but I need the highlights reel. So I need a script to summarize Hacker News pages and/or arbitrary web pages. Maybe that's what I want for getting the juice out of Medium articles.
Broadly, I very much love this approach to things and wish it was more "acceptable?" It reminds me of the opposite of things like "the useless use of cat" which to me is one of the WORST meme-type-things in this space.
Like, it's okay -- even good -- for the tools to bend to the user and not the other way around.
Using 'copy' as a clipboard script tells me OP never lived through the DOS era I guess... Used to drive me mad switching between 'cp' in UNIX and 'copy' in DOS. (Same with the whole slash vs backslash mess.)
I like the NATO one.
It occurred to me that it would be more useful to me in Emacs, and that might make a fun little exercise.
And that's how I discovered `M-x nato-region` was already a thing.
For tempe, I recommend changing "cd" to "pushd" so you can "popd" as soon as you're done.
My most used automation copies a file with rclone to backblaze blob storage, and puts the link into the clipboard. (for sharing memes)
and alias debian="docker run -it --rm -v $(pwd):/mnt/host -w /mnt/host --name debug-debian debian"
> alphabet just prints the English alphabet in upper and lowercase. I use this surprisingly often (probably about once a month)
I genuinely wonder, why would anyone want to use this, often?
As a programmer, you sometimes want to make an alphabet lookup table. So, something like:
Typing it out by hand is error prone as it's not easy to see if you've swapped the order or missed a character.
I've needed the alphabet string or lookup rarely, but I have needed it before. Some applications could include making your own UUID function, making a small random naming scheme, associating small categorical numbers to letters, etc.
The author of the article mentioned they do web development, so it's not hard to imagine they've had to create a URL shortener, maybe more than once. So, for example, creating a small name could look like:
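A hypothetical sketch of that kind of helper in bash/zsh (the length and alphabet are chosen arbitrarily):

    alphabet=abcdefghijklmnopqrstuvwxyz
    name=""
    for i in 1 2 3 4 5 6; do name="$name${alphabet:$((RANDOM % 26)):1}"; done
    echo "$name"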
Dealing with strings, dealing with hashes, random names, etc., one could imagine needing to do functions like this, or functions that are adjacent to these types of tasks, at least once a month.
Just a guess on my part though.
If your native language uses a different alphabet, you might not have been taught "the alphabet song". For example, I speak/read passable Russian, but could not alphabetize a list in Russian.
For me it's when I call customer service or support on the phone, and either give them an account #, or confirm a temporary password that I have been verbally given.
Are you referring to the nato alphabet utility? Or the alphabet script that prints
I imagine all of his passwords are abcdefghijklmnopqrstuvwxyz
I have a bunch of little scripts and aliases I've written over the years, but none are used more than these...
alias ..='cd ..'
alias ...='cd ../..'
alias ....='cd ../../..'
alias .....='cd ../../../..'
alias ......='cd ../../../../..'
alias .......='cd ../../../../../..'
In fish, I have an abbreviation that automatically expands double dots into ../ so that you can just spam double dots and visually see how far you're going.
I need this *so* often that I programmed my shell to execute 'cd ..' every time I press KP/ i.e. '/' on the keypad, without having to hit Return.
Other single-key bindings I use often are:
KP* executes 'ls'
KP- executes 'cd -'
KP+ executes 'make -j `nproc`'
How? Readline macros?
Literally with my own shell: https://github.com/cosmos72/schemesh
up() { local d=""; for ((i=1; i<=$1; i++)); do d="../$d"; done; cd "$d"; }
up 2, up 3 etc.
Does zsh support this out-of-the-box? Because I definitely never had to setup any of these kinds of aliases but have been using this shorthand dot notation for years.
It's an oh-my-zsh thing.
Yes it does.
Not on my Mac.
fish lets you cd to a folder without 'cd' although you still need the slashes. I use it all the time.
I have setup a shortcut: alt+. to run cd.., it's pretty cool.
I also aliased - to run cd -
but alt-. in bash is used for pasting the last argument to the previous command into the current one.
Good point, when working with keybindings, you'll inevitably end up overriding built-ins. I see it as a trade-off, between something I don't know of (and wouldn't use) and something I find useful. Works for me :)
Absolutely. From back in the day, the annoying one was GNU screen, which took over ctrl-a by default. I overrode that to be ctrl-^, which in bash is transpose (making "zx" into "xz"), which was rare enough to be okay with losing.
My most important script has been to remap CapsLock as a kind of custom Meta key, that transforms (when pressed) the Space into Return, hjkl into arrows, io into PgUp/PgDn, and 1-9 into function keys. Now I have a 60% keyboard that takes 0 space on my desk. And I am reaaaally happy with this setup.
[that, plus LinkHint plugin for Firefox, and i3 for WM is my way to go for a better life]
17 years ago I wrote a short VBA macro that takes the highlighted range of cells, concatenates the values into a comma-separated list, then opens the list in Notepad for easy copying and further use. I can't begin to count the number of executions by myself and those I have shared it with.
The most useful script I wrote is one I call `posh`. It shortens a file path by using environment variables. Example:
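An illustration of the behaviour described (posh itself is the commenter's own script; the variable and paths here are made up):

    $ export PROJ=~/work/projects/acme
    $ posh ~/work/projects/acme/src/main.c
    $PROJ/src/main.c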
Of course, it only becomes useful when you define a bunch of environment variables for the paths that you use often.
I use this a lot in all of my scripts. Basically, whenever any of my scripts prints a path, it passes it through `posh`.
I'd love to see this script. Does it use `env` and strip out things like PWD?
I wrote it in a way that's too intertwined with my other shit to be shareable with people, but honestly you can copy-paste my comment to your friendly neighborhood LLM and you'll get something decent. Indeed it uses `env`.
Understood. I'd rather write it myself from scratch than use an LLM; confirmation of the general process should be enough, I hope!
> cpwd copies the current directory to the clipboard. Basically pwd | copy. I often use this when I’m in a directory and I want use that directory in another terminal tab; I copy it in one tab and cd to it in another. I use this once a day or so.
You can configure your shell to notify the terminal of directory changes, and then use your terminal’s “open new window” function (eg: ctrl+shift+n) to open a new window retaining the current directory.
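In bash, for example, this is usually done by emitting an OSC 7 escape sequence from the prompt. A rough sketch (the function name is arbitrary; terminal support and the need to percent-encode special characters vary):

# tell the terminal the current directory via OSC 7 on every prompt
__report_cwd() {
  printf '\e]7;file://%s%s\e\\' "$HOSTNAME" "$PWD"
}
PROMPT_COMMAND="__report_cwd${PROMPT_COMMAND:+; $PROMPT_COMMAND}"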
> url "$my_url" parses a URL into its parts. I use this about once a month to pull data out of a URL, often because I don’t want to click a nasty tracking link.
This sounds pretty useful!
Coincidentally, I have recently learned that Daniel Stenberg et al. (of cURL fame) wrote trurl[1], a libcurl-based CLI tool for URL parsing. Its `--json` option seems to yield results similar to TFA's url, if slightly less concise because of the JSON encoding. The advantage is that recent releases of common Linux distros seem to include trurl in their repos[2].
[1]: https://curl.se/trurl/
[2]: https://pkgs.org/search/?q=trurl
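A quick taste of what that looks like (the component names passed to --get are from trurl's documentation, quoted from memory, so double-check them):

$ trurl "https://example.com/search?q=hn#top" --json
$ trurl "https://example.com/search?q=hn#top" --get '{host} {path} {query}'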
On my Ubuntu, `date -I` does ISO dates
Also re: alphabet
date -I even works on macOS, which I was pleasantly surprised by!
If you want the exact same alphabet behaviour as the OP:
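Presumably something along these lines, using bash/zsh brace expansion (a guess at what was meant):

printf '%s' {A..Z}; echo; printf '%s' {a..z}; echo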
I have a script called catfiles that I store in ~/.local/bin that recursively dumps every source file with an associated file header, so I can paste the resulting blob into Gemini and ChatGPT in order to have a conversation about the changes I would like to make before I send off the resulting prompt to Gemini Code Assist.
Here's my script if anyone is interested, as I find it incredibly useful.
find . -type f \( -name "*.tf" -o -name "*.tfvars" -o -name "*.json" -o -name "*.hcl" -o -name "*.sh" -o -name "*.tpl" -o -name "*.yml" -o -name "*.yaml" -o -name "*.py" -o -name "*.md" \) -exec sh -c 'for f; do echo "### FILE: $f ###"; cat "$f"; echo; done' sh {} +
> alphabet just prints the English alphabet in upper and lowercase. I use this surprisingly often
I'm curious to hear some examples (feel like I'm missing out)
I think it needs yt-dlp installed — and reasonably up to date, since YouTube keeps breaking yt-dlp... but the updates keep fixing it :)
On the subject of yt-dlp, I use it to get (timestamped) transcripts from YouTube, to shove into LLMs for summaries.
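For example, something along these lines (the exact flags and output file name may differ; $YT_URL is a placeholder):

# fetch auto-generated English subtitles as .vtt without downloading the video
yt-dlp --skip-download --write-auto-subs --sub-langs en --sub-format vtt -o transcript "$YT_URL"
vtt2txt transcript.en.vtt > transcript.txt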
Where vtt2txt is a python script — slightly too long to paste here — which strips out the subtitle formatting, leaving a (mostly) human-readable transcript.

I started writing way more utility scripts when I found babashka. Magic of Clojure, instant startup, easy to shell out to any other command, tons of useful built-in stuff, developing with the REPL. It's just a good time!!
I had my hopes on this project RawDog using local smol sized LLMs but it hasn't been updated in a while. I feel like all this should be running easily in the background nowadays.
https://github.com/AbanteAI/rawdog
Share yours!
I use this as a bookmarklet to grab the front page of the new york times (print edition). (You can also go back to any date up to like 2011)
I think they go out at like 4 am. So, day-of, note that it will fail if you're in that window before publishing.
I like this one.
The scripts from my junk drawer (https://github.com/peterwwillis/junkdrawer) I use every day are 'kd' and 'gw', which use the Unix dialog command to provide an easy terminal UI for Kubectl and Git Worktrees (respectively)... I probably save 15+ minutes a day just flitting around in those UIs. The rest of the scripts I use for random things; tasks in AWS/Git/etc I can never remember, Terraform module refactoring, Bitbucket/GitHub user management, Docker shortcuts, random password generation, mirroring websites with Wget, finding duplicate files, etc.
Obviously, to each their own, but to me, this is an overwhelming amount of commands to remember on top of all the ones they are composed of, which you will likely need to know anyway, regardless of whether all the custom ones exist.
Like, I'd have to remember both `prettypath` and `sed`, and given that there's hardly any chance I'll not need `sed` in other situations, I now need to remember two commands instead of one.
On top of that `prettypath` only does s/:/\\n/ on my path, not on other strings, making its use extremely narrow. But generally doing search and replace in a string is incredibly useful, so I'd personally rather just use `sed` directly and become more comfortable with it. (Or `perl`, but the point is the same.)
As I said, that's obviously just my opinion, if loads of custom scripts/commands works for you, all the more power to you!
Please note that 'each' is fundamentally different from 'xargs'.
`each cmd` is the same as a `while read` loop running `cmd` once per line of input, while `xargs cmd` is the same as a single `cmd` invocation with all the input lines as arguments. I would rather say that 'each' replaces (certain uses of) 'for'.

It's equivalent to xargs -I {} rm {}
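A small illustration of the distinction, assuming `each` runs its command once per input line:

# one rm process per file name (what each / xargs -I does)
printf '%s\n' a.txt b.txt | xargs -I {} rm {}

# one rm process for the whole batch (what plain xargs does)
printf '%s\n' a.txt b.txt | xargs rm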
> `rn` prints the current time and date using date and cal.
And you can type `rn -rf *` to see all timezones recursively. :)
I did something similar with copy until I found this which works across remote terminals too:
`alias clip="base64 | xargs -0 printf '\e]52;c;%s\007'"`
It just sends it to the client’s terminal clipboard.
`cat thing.txt | clip`
and here's me still ctrl+r-ing for my commonly used methods
hopefully with fzf and not with the built in ctrl r
Here are some snippets that we've compiled over time:
https://snhps.com
They're not all necessarily the most efficient/proper way to accomplish a task, but they're nice to have on hand and be able to quickly share.
Admittedly, their usefulness has been diminished a bit since the rise of LLMs, but they still come in handy from time to time.
I've started using snippets for code reviews, where I find myself making the same comments (for different colleagues) regularly. I have a keyboard shortcut opening a fuzzy search to find the entry in a single text file. That saves a lot of time.
As an aside, I find most of these commands very long. I tend to use very short aliases, ideally 2 characters. I'm assuming the author uses tab most of the time, if the prefixes don't overlap beyond 3 characters it's not that bad, and maybe the history is more readable.
One of my biggest headaches is stripping a specific number of bytes from the head or tail of a binary file, and I couldn't find any built-in tool for that, so I wrote one in C++.
Last X bytes: dd bs=1 skip=X
First X bytes: dd bs=X count=1
Thanks, there were a few errors; after testing:
1. stripping the first X bytes: dd bs=1 skip=X
2. stripping the last X bytes: truncate -s -X
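Concretely, for 16 bytes (file names are placeholders; note that truncate modifies the file in place, so copy it first if you need the original):

# strip the first 16 bytes
dd if=in.bin of=out.bin bs=1 skip=16

# strip the last 16 bytes (in place)
truncate -s -16 in.bin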
`line 10` can be written as `sed -n 10p` (instead of head+tail)
My most used function is probably the one I use to find the most recent files:
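The function itself wasn't shown; one common way to write it with GNU find looks like this (a guess at what's meant, and it breaks on file names containing newlines):

# recent [N]: print the N most recently modified files under the current dir
recent() {
  find . -type f -printf '%T@ %p\n' | sort -rn | head -n "${1:-10}" | cut -d' ' -f2-
}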
These are great, and I have a few matching myself.
Here are some super simple ones I didn't see that I use almost every day:
cl="clear"
g="git"
h="history"
ll="ls -al"
path='echo -e ${PATH//:/\\n}'
lv="live-server"
And for common navigation:
dl="cd ~/Downloads"
dt="cd ~/Desktop"
I'm terrible about remembering shortcuts (edit a bash line in an editor? Can never remember it) but clear (CTRL-l) is one that really stuck.
That and exit (CTRL-d). A guy I used to work with just mentioned it casually and somehow it just seared itself into my brain.
k=kubectl
Love this, lots of great ideas I'll be stealing :)
Folks interested in scripting like this might like this tool I'm working on https://github.com/amterp/rad
Rad is built specifically for writing CLI scripts and is perfect for these sorts of small to medium scripts, takes a declarative approach to script arguments, and has first-class shell command integration. I basically don't write scripts in anything else anymore.
> `nato bar` returns Bravo Alfa Romeo. I use this most often when talking to customer service and need to read out a long alphanumeric string, which has only happened a couple of times in my whole life. But it’s sometimes useful!
Even more useful is just learning the ICAO Spelling Alphabet (aka NATO Phonetic Alphabet, of which it is neither). It takes like an afternoon and is useful in many situations, even if the receiver does not know it.
Some time ago I tried to tell my email address to someone in Japan over the phone who did not speak English very well. It turned out to be basically impossible. I realized later one could probably come up with a phonetic alphabet of English words most Japanese know!
I've got a ccurl Python script that extracts the cookies from my Firefox profile and then passes those on to curl; that way I can get webpages where I'm logged in.
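A rough sketch of that idea in shell, with the profile path and sqlite schema assumed (Firefox locks cookies.sqlite while running, so a real script should work on a copy):

ccurl() {
  local url=$1; shift
  local host=${url#*://}; host=${host%%/*}
  local db cookie
  # pick the first default Firefox profile's cookie database
  db=$(ls ~/.mozilla/firefox/*.default*/cookies.sqlite 2>/dev/null | head -n1)
  cookie=$(sqlite3 "$db" \
    "SELECT group_concat(name || '=' || value, '; ') FROM moz_cookies
     WHERE host = '$host' OR host = '.$host';")
  curl -H "Cookie: $cookie" "$url" "$@"
}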
As a fun game, I suggest feeding the entire piece to an LLM and asking it to create those scripts. The differences between Claude, GPT-5 and Gemini are very interesting.
As a bonus, I prepend my custom aliases or scripts with my user name and hyphen (i.e helicaltwine-). It helps me recall rarely used scripts when I need them and forget the names.
I follow a similar but more terse pattern. I prepend them all with a comma, and I have yet to find any collisions. If you're using bash (and I assume posix sh as well), the comma character has no special meaning, so this is quite a nice use for it. I agree that it's nice to type ",<tab>" and see all my custom scripts appear.
This is one area where I've found success in vibe coding: making scripts for repetitive tasks that are just above the complexity threshold where the math between automating and doing it manually is not so clear. I have Copilot generate the code for me, and honestly I don't care too much about its quality or extensibility; the scripts are easy enough to read through that I don't feel like my job is being an AI PR reviewer.
Nice. I have a bash script similar to the one listed "removeexif" called prep_for_web which takes any image file (PNG, BMP, JPG, WebP), scrubs EXIF data, checks for transparency and then compresses it to either JPG using MozJPEG or to PNG using PNGQuant.
[1] https://github.com/mozilla/mozjpeg
[2] https://pngquant.org
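A stripped-down sketch of that pipeline (assumes ImageMagick 7's magick, mozjpeg's cjpeg and pngquant on PATH; quality values are placeholders, not the commenter's actual settings):

#!/usr/bin/env bash
# prep_for_web (sketch): strip metadata, then compress based on transparency
set -euo pipefail
in=$1
base=${in%.*}

if [ "$(magick identify -format '%[opaque]' "$in")" = "True" ]; then
  # fully opaque: strip metadata and recompress with mozjpeg's cjpeg
  magick "$in" -strip PNM:- | cjpeg -quality 80 > "${base}.web.jpg"
else
  # has transparency: strip metadata and quantize with pngquant
  magick "$in" -strip PNG:- | pngquant --quality 60-80 - > "${base}.web.png"
fi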
An important advantage of aliases was not mentioned: I see everything in one place and can easily build aliases on top of other aliases without much thinking.
Anyways, my favourite alias that I use all the time is this:
It solves the "not loaded automatically" part, at least for the current terminal.

I also have a radio script to play internet streams with mpv (?). Other random stuff:
A password or token generator, simple or complicated random text.
Scripts to list, view and delete mail messages inside POP3 servers
n, to start Nautilus from terminal in the current directory.
lastpdf, to open the last file I printed as PDF.
lastdownload, to view the names of the n most recent files in the Downloads directory.
And many more but those are the ones that I use often and I remember without looking at ~/bin
Mkdir then cd into it, I just use ‘take’? Maybe this isn’t available by default everywhere?
It's an omz thing.
Interesting, but none of the links are working... codeberg.org isn't responding, it just spins forever.
A couple more standard approaches with fewer chars:
jsonformat -> jq
running -> pgrep
fish abbreviations >> bash aliases
Thank you, I also stopped using aliases and have everything as scripts in my ~/bin
absolutely love these time savers!!
Why we don't have mkcd in Linux natively boggles my mind :)
Likewise, why doesn't git clone automatically cd into the repo?
A subprocess (git) can't modify the working directory of the parent process (the shell). This is a common annoyance with file managers like yazi and ranger as well—you need an extra (usually manual!) installation step to add a shell integration for whichever shell you're using so the shell itself can change directory.
The best solution for automatically cd'ing into the repo is to wrap git clone in a shell function or alias. Unfortunately I don't think there's any way to make git clone print the path a repository was cloned to, so I had to do some hacky string processing that tries to handle the most common usage (ignore the "gh:" in the URL regex, my git config just expands it to "git@github.com:"):
https://github.com/Andriamanitra/dotfiles/blob/d1aecb8c37f09...
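For anyone who doesn't want to read through that, a simpler sketch of the same wrapper idea (it just guesses the directory from the last argument, so it misses edge cases the linked version handles):

gclone() {
  git clone "$@" || return
  local last="${!#}" dir
  if [ -d "$last" ]; then
    dir=$last                          # an explicit target directory was given
  else
    dir=$(basename "${last%/}" .git)   # derive it from the repo URL
  fi
  cd "$dir"
}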
zsh has a `take` utility that is exactly this
Lately I’ve been using caffeinate to run long running scripts without interruption from sleep on Mac. Nothing crazy but could be useful to newer devs.
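The handy form is to pass the command to caffeinate directly, so the no-sleep assertion is held only while the job runs (script name is just a placeholder):

# prevent idle sleep while the script runs, then release automatically
caffeinate -i ./long_running_job.sh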
These aren't bad, but they'd be much better if they were all flags to the cat command.
E.g. cat --copy
If you use x0vnc (useful if you use a Linux machine both from the attached screen and from VNC, and in a bunch of other scenarios), copy and paste to and from the VNC client is not implemented, which is quite frustrating. Here are 2 scripts that do that for you; I now use this all day. https://github.com/francoisp/clipshare
Some of these, especially the text processing, are already built into Nushell.
4:20 PM - nice
A lot of these scripts could be just shell aliases.
OP links to a long blog post with reasons for using scripts
https://evanhahn.com/why-alias-is-my-last-resort-for-aliases...
It's been a while since I've read something as useful as this!
There's also some very niche stuff that I won't use but found funny.
The nato phonetic alphabet one cracked me up. My dude you don't need that, call center employees don't know it, just say S as in Sugar like ur grandma used to.
The NATO alphabet is made of commonly known words that are hard to misspell and hard to mishear (at least the initial phonemes). The person on the other end doesn't need to be able to recite the words, they just need to be able to hear "november" and recognize that it starts with N.
Yes thank you that's a good description of what a phonetic alphabet is and how it's used.
The NATO phonetic alphabet is still useful even if the other party doesn't know it; I've used it a bunch of times on the phone to spell out my 10-letter last name. Saves quite a lot of time and energy for me vs saying "letter as in word" for each letter.
Exactly. The listening party doesn't need to have knowledge of the NATO alphabet to still benefit from it since they are just regular English words.
I once had someone sound out a serial number over a spotty phone connection years ago and they said "N as in NAIL". You know what sounds a lot like NAIL? MAIL.
And that is why we don't just arbitrarily make up phonetic alphabets.
> saying "letter as in word" for each letter
Which often just confuses things further.
Me: My name is "Farb" F-A-R-B. B as in Baker.
Them: Farb-Baker, got it.
"M as in Mancy."
Right but it's not much more useful than any other phonetic alphabet the other party doesn't know, including the one you make up on the spot.
If you're me, it's still useful because the ones I make up on the spot aren't great.
"S-T-E-V-E @ gmail.com, S as in sun, T as in taste, ..." "Got it, fpeve."
I dunno, there's a pretty good chance that the one that people spent time and effort designing to replace earlier efforts with the goal of reducing potential ambiguity and for use over noisy connections with expectation that mistakes could cost lives is probably better than what you improvise on the spot
When I worked in customer service, I asked a teammate what I could do to spell back something the customer said, and she taught me that system, it helped me a lot.
I once had the customer service agent for Iberia (the Spanish airline) confirm my confirmation number with me using it.
It worked with me and I guess it must have usually worked for him in most of his customer interactions.
I've found the NATO alphabet fairly common at call centers, with globalization being a factor.
Does anyone have links to other awesome articles like this?
Also this thread and the original article it’s about:
https://news.ycombinator.com/item?id=42057431
This thread was good:
https://news.ycombinator.com/item?id=31928736
Where are the one-letter aliases? My life got better after I aliased k=kubectl
30% of the productivity hacks can be achieved in vanilla Nushell.
The markdownquote can be replaced by (at least in vim):
^ (jump to the beginning)
ctrl+v (block selection)
j (move cursor down)
shift+i (insert at the start of each selected line)
type ><space>
ESC
:'<,'>s/.*$/> \0/ also
or even better:
Duh, yeah you're right. This wins.
Love this. Gunna use plenty of these
in oh-my-zsh you can use `take` to do what mkcd does.
I had youtube and serveit and some others, but pasta is really good, thanks!
Last month I saw a tweet how to serve files using
python3 -m http.server 1337
Then I turned it into an alias, called it "serveit" and tweeted about it. And now I see it as a bash script, made a little bit more robust in case python is not installed :)
that was beautiful to read. command line ftw!
`perldoc perlrun`
Not really a one liner but this comes in handy for quick dns on multiple hostnames:
https://gist.github.com/jgbrwn/7dd4b262c544f750cb0291161b2ec...
(actually avoids having to do a one-liner like: for h in {1..5}; do dig +short A "mail${h}.domain.com" @1.1.1.1; done)
Hmm speaking of which I need to add in support for using a specific DNS server
this is really great. at some point i gave up on being more efficient in the terminal, but many pain points are solved by your work
mksh is already the MirBSD Korn SHell
Which very very few people have actually installed on their system.
The "scripts" I use the most that I am most happy with are a set of Vagrant tools that manage initialising different kinds of application environments with an apt cache on the host. Also .ssh/config includes to make it as easy as possible to work with them from VSCode.
I set this stuff up so long ago I sort of forgot that I did it at all; it's like a standard feature. I have to remember I did it.
> wifi toggle
this fella doesn't know what "toggle" means. in this context, it means "turn off if it's currently on, or turn on if it's currently off."
this should be named `wifi cycle` instead. "cycle" is a good word for turning something off then on again.
naming things is hard, but it's not so hard that you can't use the right word. :)
or wifi toggle-toggle!
I hope to see an operating system with these scripts as built-in, because they are so intuitive and helpful! Which OS will be the first to take this on?
Looks very useful!
cool collection
No offense, but a lot of those scripts are pretty hacky. They may work for the user, but I would not use them without reviewing them and adapting them to my workflow.
That's a fair point. I think the author intended the post to be a treasure trove of ideas for your own scripts, not as something to blindly include in your daily workflow.