aktau 8 hours ago

I have a bunch, but one that I rarely see mentioned but use all the time is memo(1) (https://github.com/aktau/dotfiles/blob/master/bin/memo).

It memoizes the command passed to it.

  $ memo curl https://some-expensive.com/api/call | jq . | awk '...'
Manually clearing it (for example, if I know the underlying data has changed):

  $ memo -c curl https://some-expensive.com/api/call
In-pipeline memoization (includes the input in the hash of the lookup):

  $ cat input.txt | memo -s expensive-processor | awk '...'
This allows me to rapidly iterate on shell pipelines. The main goal is to minimize my development latency, but it also has positive effects on dependencies (avoiding redundant RPC calls). The classic way of doing this is storing something in temporary files:

  $ curl https://some-expensive.com/api/call > tmpfile
  $ cat tmpfile | jq . | awk '...'
But I find this awkward, and makes it harder than necessary to experiment with the expensive command itself.

  $ memo curl https://some-expensive.com/api/call | jq . | awk '...'
  $ memo curl --data "param1=value1" https://some-expensive.com/api/call | jq . | awk '...'
Both of those will run curl once.

NOTE: Currently environment variables are not taken into account when hashing.
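
The core idea fits in a few lines of shell; a rough sketch (not the actual memo(1) implementation, which also handles -c, -s, compression, etc.):

  memo_sketch() {
    local dir="/tmp/memo/$USER" key
    mkdir -p "$dir"
    # hash the full command line to get the cache key
    key=$(printf '%s\0' "$@" | sha256sum | cut -d' ' -f1)
    if [ ! -f "$dir/$key" ]; then
      # cache miss: run the command and store its output
      "$@" > "$dir/$key" || { rm -f "$dir/$key"; return 1; }
    fi
    cat "$dir/$key"
  }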

  • aabdelhafez 3 hours ago

    You're gonna absolutely love up (https://github.com/akavel/up).

    If you pipe curl's output to it, you'll get a live playground where you can finesse the rest of your pipeline.

      $ curl https://some-expensive.com/api/call | up
    • aktau 2 hours ago

      up(1) looks really cool, I think I'll add it to my toolbox.

      It looks like up(1) and memo(1) have similar use cases (or goals). I'll give it a try to see if I can appreciate its ergonomics. I suspect memo(1) will remain my mainstay:

        1. After executing a pipeline, I like to press the up arrow (heh) and edit. Surprisingly often I need to edit something that's *not* the last part, but somewhere in the middle. I find this cumbersome in default line editing mode, so I will often drop into my editor (^X^E) to edit the command.
        2. Up seems to create a shell script file after completion. Avoiding the creation of extra files was one of my goals for memo(1). I'm sure some smart zsh/bash integration could be made that just returns the completed command instead.
  • aktau 5 hours ago

    Another thing I built into memo(1) which I forgot to mention: automatic compression. memo(1) will use available (de)compressors (in order of preference: zstd, lz4, xz, gzip) to (de)compress stored contents. It's surprising how much disk space and IOPS can be saved this way due to redundancy.
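
    The selection logic is essentially just this (a sketch, not the exact code):

      # pick the first (de)compressor that's actually installed
      for c in zstd lz4 xz gzip; do
        command -v "$c" >/dev/null 2>&1 && break
      done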

    I currently only have two memoized commands:

      $ for f in /tmp/memo/aktau/* ; do 
          ls -lh "$f" =(zstd -d < $f) 
        done
      -rw-r----- 1 aktau aktau  33K /tmp/memo/aktau/0742a9d8a34c37c0b5659f7a876833b6dad9ec689f8f5c6065d05f8a27d993c7bbcbfdc3a7337c3dba17886d6f6002e95a434e4629.zst
      -rw------- 1 aktau aktau 335K /tmp/zshSQRwR9
    
      -rw-r----- 1 aktau aktau  827 /tmp/memo/aktau/8373b3af893222f928447acd410779182882087c6f4e7a19605f5308174f523f8b3feecbc14e1295447f45b49d3f06da5da7e8d7a6.zst
      -rw------- 1 aktau aktau 7.4K /tmp/zshlpMMdo
    
    That's roughly 10x compression ratio.
  • dotancohen 6 hours ago

    This is terrific! I curl to files and then pipe them, all the time. This will be a great help.

    I wonder if we have gotten to the point where we can feed an LLM our bash history and it could suggest improvements to our workflow.

  • Perepiska 5 hours ago

    Caching some API call because it is expensive, and then using the cached data many months later because of a bash history suggestion :(

    • aktau 3 hours ago

      The default storage location for memo(1) output is /tmp/memo/${USER}. Most distributions have some automatic periodic cleanup of /tmp, wipe it on restart, or both.

      Separately from that:

        - The invocation contains *memo* right in there, so you (the user) know that it might memoize.
        - One uses memo(1) for commands that are generally slow. Rerunning your command that has a slow part and having it return in a millisecond while you weren't expecting it should make the spider-sense tingle.
      
      In practice, this has never been a problem for me, and I've used this hacked together command for years.
  • gavinray 7 hours ago

    15 years of Linux and I learn something new all the time...

    • mlrtime 5 hours ago

      It's why I keep coming back. Now, how do I remember to use this and not go back to using tmpfiles :)

      • divan 5 hours ago

        I've used the Warp terminal for a couple of years, and recently they embedded AI into it. At first I was irritated and disabled it, but the AI agent is built in as an optional mode (Cmd-I to toggle). And I found myself using it more and more often for commands that I have no capacity or will to remember or dig through the man pages for (from "figure out my IP address on the wifi interface" to "make ffmpeg do this or that"). It's fast and can iterate on its own errors, and now I can't resist using it regularly. It removes the need for "tools to memorize commands" entirely.

  • cryptonector an hour ago

    > `curl ... | jq . | awk '...'`

    Uhm, jq _is_ as powerful as awk (if not more so). You can use jq directly and skip awk.

    (I know, old habits die hard, and learning functional programming languages is not easy.)

  • naikrovek 39 minutes ago

    i see no way to name the memo in your examples, so how do you refer to them later?

    also, this seems a lot like an automated way to write shell scripts that you can pipe to and from. so why not use a shell script that won't surprise anyone instead of this, which might?

  • sgarland 4 hours ago

    Dude, this is _awesome_. Thank you for sharing!

    • aktau 3 hours ago

      Glad you like it. Hope you get as much use out of it as I do.

latexr a day ago

> trash a.txt b.png moves `a.txt` and `b.png` to the trash. Supports macOS and Linux.

The way you’re doing it trashes files sequentially, meaning you hear the trashing sound once per file and ⌘Z in the Finder will only restore the last one. You can improve that (I did it for years) but consider just using the `trash` command which ships with macOS. Doesn’t use the Finder, so no sound and no ⌘Z, but it’s fast, official, and still allows “Put Back”.

> jsonformat takes JSON at stdin and pretty-prints it to stdout.

Why prioritise node instead of jq? The latter is considerably less code and even comes preinstalled with macOS, now.
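
For comparison, the jq version is just this (and jq comes preinstalled on recent macOS):

    $ echo '{"hello": "world"}' | jq .
    {
      "hello": "world"
    }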

> uuid prints a v4 UUID. I use this about once a month.

Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?

https://www.man7.org/linux/man-pages/man1/uuidgen.1.html

  • tester457 21 hours ago

    I am not the author, but my bet is that he didn't know of its existence.

    The best part about sharing your config or knowledge is that someone will always light up your blind spots.

    • t_mahmood 7 hours ago

      > The best part about sharing your config or knowledge is that someone will always light up your blind spots.

      Yes! I will take this as a chance to thank every person who has shared their knowledge on the Internet. You guys are so freaking awesome! You are always appreciated.

      A big chunk of my whole life's learning came from all the forums that I used to scour through, hour after hour! Because these awesome people were always sharing their knowledge, and someone was always adding more. That's what made the Internet the Internet. And all of it is now almost on the brink of being lost, because of greedy corporations.

      This habit also helped me with doom-scrolling. I sometimes do doomscroll, but I can catch it quickly and snap out of it. Because, my whole life, I always jumped into the rabbit holes and actually read those big blog posts, where you had those `A-ha` moments: "Oohh, I can use that", "Ahh, that's clever!"

      When browsing doesn't give me that, my brain actually triggers: "What are you doing?"

      Later, I got lazy, which I am still paying for. But I am going to get out of it.

      Never stop jumping into those rabbit holes!! Well, obviously, not every rabbit hole is a good one, but you'll probably come out wiser.

    • _kb 18 hours ago

      Or more abstractly: post anything to the internet and people will always detail how you’re wrong. Sometimes that can be useful.

      • byryan 16 hours ago

        That seems to be especially true on HN. Other forums have some of that as well, but on HN it seems nearly every single comment section is like 75% (random number) pointing out faults in the posted article.

        • gaudystead 16 hours ago

          Although I normally loathe pedantic assholes, I've found the ones on HN seem to be more tolerable because they typically know they'll have to back up what they're saying with facts (and ideally citations).

          I've found that pedantic conversations here seem to actually have a greater potential for me to learn something from them than other forums/social platforms. On other platforms, I see someone providing a pedantic response and I'll just keep moving on, but on HN, I get curious to not only see who wins the nerd fight, but also that I might learn at least one thing along the way. I like that it's had an effect on how I engage with comment sections.

          • password4321 15 hours ago

            And the worst of it gets flagged and even dead-ed so most skip it after a bit, as I assumed would happen recently

            https://news.ycombinator.com/item?id=45649771

            • imcritic 8 hours ago

              Yes, flagging mechanism on HN is evil.

              • MyOutfitIsVague 3 hours ago

                I have showdead on, and almost every single flagged post I've seen definitely deserves it. Every time it wasn't "deserved", the person simply took an overly aggressive tone for no real reason.

                In short, I've never seen somebody flagged simply for having the wrong opinion. Even controversial opinions tend to stay unflagged, unless they're incredibly dangerous or unhinged.

                • lolc 2 hours ago

                  I've seen a few dead posts where there was an innocent misunderstanding or wrong assumption. In those cases it would have been beneficial to keep the post visible and post a response, so that readers with similarly mistaken assumptions could have seen a correction. Small minority of dead posts though. They can be vouched for actually but of course this is unlikely to happen.

                  I agree that most dead posts would be a distraction and good to have been kept out.

              • kergonath 5 hours ago

                It’s a blunt tool, but quite useful for posts. I read most dead posts I come across and I don’t think I ever saw one that was not obviously in violation of several guidelines.

                OTOH I don’t like flagging stories because good ones get buried regularly. But then HN is not a great place for peaceful, nuanced discussion and these threads often descend into mindless flame wars, which would bury the stories even without flagging.

                So, meh. I think flagging is a moderately good thing overall but it really lacks in subtlety.

          • nosianu 8 hours ago

            > I've found the ones on HN seem to be more tolerable because they typically know they'll have to back up what they're saying with facts (and ideally citations).

            Can you back this up with data? ;-)

            I see citations and links to sources here about as rarely as on reddit.

            The difference I see is in the top 1% of comments, which exist here in the first place and are better on average (though that depends on what other forums or subreddits you compare it to; /r/AskHistorians is pretty good for serious history answers, for example), but not in the rest of the comments. Also, fewer distractions, more staying on topic, and the joke replies are punished more often and are less frequent.

        • bdangubic 16 hours ago

          I find that endearing for two reasons:

          - either the critique is solid and I learn something

          - or the commenter is clueless, which makes it entertaining

          there is very seldom a “middle”

          • byryan 16 hours ago

            Yea I don't particularly mind it, just an interesting thing about HN compared to many other forums.

      • mlrtime 5 hours ago

        True true, one of my favorite things is watching the shorts on home improvement or 'hacks', and sure enough there are always multiple comments saying why it won't work and why it's not the right way. Just as entertaining as the video.

    • gigatexal 19 hours ago

      Exactly! I didn’t know macOS ships JQ or the uuidgen tool. Very cool

    • dylan604 16 hours ago

      also possible (even though I've seen the author's response about not knowing) is that the scripts were written before the native tools were included. at that point, the muscle memory is just there. I know I have a few scripts like that myself

  • lkbm an hour ago

    > Why prioritise node instead of jq? The latter is considerably less code and even comes preinstalled with macOS, now.

    That was my thought. I use jq to pretty print json.

    What I have found useful is j2p and p2j to convert to/from python dict format to json format (and pretty print the output). I also have j2p_clip and p2j_clip, which read from and then write to the system clipboard so I don't have to manually pipe in and out.
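
    Roughly, a p2j along those lines can be as small as this (a sketch, not my actual script):

      p2j() { python3 -c 'import ast, json, sys; print(json.dumps(ast.literal_eval(sys.stdin.read()), indent=2))'; }
      p2j_clip() { pbpaste | p2j | pbcopy; }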

    > Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?

    I also made a uuid, which just runs uuidgen, but then trims the \n. (And maybe copied to clipboard? It was at my old job, and I don't seem to have saved it to my personal computer.)

  • rbonvall 21 hours ago

    Python also pretty-prints out of the box:

        $ echo '{ "hello": "world" }' | python3 -m json.tool
        {
            "hello": "world"
        }
  • idoubtit 17 hours ago

    Other examples where native features are better than these self-made scripts...

    > vim [...] I select a region and then run :'<,'>!markdownquote

    Just select the first column with ctrl-v, then "I> " then escape. That's 4 keys after the selection, instead of 20.

    > u+ 2025 returns ñ, LATIN SMALL LETTER N WITH TILDE

    `unicode` is widely available, has a good default search, and many options. BTW, I wonder why "2025" matched "ñ".

         unicode ñ
        U+00F1 LATIN SMALL LETTER N WITH TILDE
        UTF-8: c3 b1 UTF-16BE: 00f1 Decimal: &#241; Octal: \0361
    
    > catbin foo is basically cat "$(which foo)"

    Since the author is using zsh, `cat =foo` is shorter and more powerful. It's also much less error-prone with long commands, since zsh can smartly complete after =.

    I use it often, e.g. `file =firefox` or `vim =myscript.sh`.

    • oneeyedpigeon 7 hours ago

      > `unicode` is widely available

      It's not installed by default on macOS or Ubuntu, for me.

      • pmontra 3 hours ago

        You are right but

          $ unicode
          Command 'unicode' not found, but can be installed with:
          sudo apt install unicode
        
        and installing it that way worked. So it really was available. That's Debian 11.
  • mmmm2 20 hours ago

    `trash` is good to know, thanks! I'd been doing: "tell app \"Finder\" to move {%s} to trash" where %s is a comma separated list of "the POSIX file <path-to-file>".

    • gcanyon 12 hours ago

      Oooh, I just suggested in another comment that using applescript would be possible. I didn't think it would be this easy though.

  • gcanyon 12 hours ago

    I believe it would be possible to execute an applescript to tell the Finder to delete the files in one go. It would theoretically be possible to construct/run the applescript directly in a shell script. It would be easier (but still not trivial) to write an applescript file that takes the file list as an argument and deletes them when called from the shell.

    • latexr 9 hours ago

      It’s not theoretical, and it is trivial. Like I said, I did exactly that for years. Specifically, I had a function in my `.zshrc` to expand all inputs to their full paths, verify and exclude invalid arguments, trash the rest in one swoop, then show me an error with the invalid arguments, if any.
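
      Roughly this shape, if you want to roll your own (a simplified sketch, not my exact function):

        trash() {
          local items="" f
          for f in "$@"; do
            [ -e "$f" ] || { echo "skipping invalid argument: $f" >&2; continue; }
            # Finder wants absolute POSIX paths
            items="${items:+$items, }POSIX file \"$(cd "$(dirname "$f")" && pwd)/$(basename "$f")\""
          done
          [ -n "$items" ] && osascript -e "tell app \"Finder\" to move {$items} to trash"
        }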

  • true_religion 13 hours ago

    The trash command first appeared in macOS 14.0, which was released in 2023.

  • energy123 8 hours ago

    I do `mv a.txt /tmp` instead of `rm`.

  • shortrounddev2 20 hours ago

    > Why prioritise node instead of jq?

    In powershell I just do

        > echo '{"foo": "bar"}' | ConvertFrom-Json | ConvertTo-Json
        {
            "foo": "bar"
        }
    
    But as a function
  • sedatk 21 hours ago

    and it's `New-Guid` in PowerShell.

  • YouAreWRONGtoo 19 hours ago

    Instead of trash, reimplementing rm (to only really delete after some time or depending on resource usage, or to shred if you are paranoid and the goal is to really delete something) or using zfs makes much more sense.

    • orhmeh09 18 hours ago

      I can't imagine a scenario where I would want to reimplement rm just for this.

      • YouAreWRONGtoo 8 hours ago

        [flagged]

        • latexr 8 hours ago

          https://news.ycombinator.com/newsguidelines.html

          > Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.

          Instead of being rude to a fellow human making an inoffensive remark, you could’ve spent your words being kind and describing the scenario you claim exists. For all you know, maybe they did ask ChatGPT and were unconvinced by the answer.

          As a side note, I don’t even understand how your swipe would make sense. If anything, needing ChatGPT is what demonstrates a lack of imagination (having the latter you don’t need the former).

          • YouAreWRONGtoo 2 hours ago

            What makes you think I need ChatGPT, since I just wondered whether ChatGPT was as stupid, since obviously I do know why that would be useful?

    • thiht 8 hours ago

      How is this better?

  • frumplestlatz 19 hours ago

    For trash on macOS, I recommend https://github.com/ali-rantakari/trash

    Does all the right things and works great.

    There’s a similar tool that works well on Linux/BSDs that I’ve used for years, but I don’t have my FreeBSD desktop handy to check.

elric 7 hours ago

I've written on this before, but I have an extensive collection of "at" scripts. This started 25+ years ago when I dragged a PC tower running BSD to a friend's house, and their network differed from mine. So I wrote an @friend script which did a bunch of ifconfig foo.

Over time that's grown to an @foo script for every project I work on, every place I frequent that has some kind of specific setup. They are prefixed with an @ because that only rarely conflicts with anything, and tab-complete helps me remember the less frequently used ones.

The @project scripts set up the whole environment, alias the appropriate build tools and versions of those tools, prepare the correct IDE config if needed, drop me in the project's directory, etc. Some start a VPN connection because some of my clients only have git access over VPN, etc.

Because I've worked on many things over many years, most of these scripts also print some "help" text so I can remember how shit works for a given project.

Here's an example:

    # @foo
    
    PROJECT FOO
    -----------
    
    VPN Connection: active, split tunnel
    
    Commands: 
    tests: mvn clean verify -P local_tests
    build all components: buildall
    
    Tools:
    java version: 17.0.16-tem
    maven version: 3.9.11
Edit: a word on aliases: I frequently alias tools like maven or ansible to include config files that are specific to that project. That way I can have a .m2 folder for every project that doesn't get polluted by other projects, I don't have to remember to tell ansible which inventory file to use, etc. I'm lazy and my memory is for shit.
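
A stripped-down sketch of the shape of such a script (the paths, versions and tools here are made up, not my actual setup):

    #!/usr/bin/env bash
    # @foo: per-project environment launcher (sketch)
    echo "PROJECT FOO -- tests: mvn clean verify -P local_tests ; build all: buildall"
    cd ~/src/foo || exit 1
    export JAVA_HOME=~/.sdkman/candidates/java/17.0.16-tem   # pin the JDK
    export MAVEN_OPTS="-Dmaven.repo.local=$PWD/.m2"          # per-project .m2
    exec "$SHELL" -i   # drop into a shell that inherits all of the above
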
  • blixt 6 hours ago

    Slightly related, but mise, a tool you can use instead of e.g. make, has "on enter directory" hooks that can reconfigure your system quite a bit whenever you enter the project directory in the terminal. Initially I was horrified by this idea, but I have to admit it's been quite nice to enter a directory and have everything set up just right, also for new people joining. It has built-in version management of just about every command line tool you could imagine, so that an entire team can be on a consistent setup of Python, Node, Go, etc.

    • blixt 3 hours ago

      I see other people mentioning env handling, and mise does this too, with additional support for extra env overrides via a dedicated file such as a .mise.testing.toml config, running something like:

      MISE_ENV=testing bun run test

      (“testing” in this example can be whatever you like)

    • nullwarp 5 hours ago

      This is very useful to me and I had no idea, thanks for pointing that feature out!

  • mlrtime 5 hours ago

    I'm stealing the top comment here because you probably know what I'm asking.

    I've always wanted a linux directory hook that runs some action. Say I have a scripts dir filled with 10 different shell scripts. I could easily have a readme or something to remember what they all do.

    What I want is some hook in a dir so that every time I cd into that dir it runs the hook. Most of the time it would be a simple 'cat usage.txt' but sometimes it may be 'source .venv/bin/activate'.

    I know I can alias the cd and the hook together but I don't want that.

    • eadmund 5 hours ago

      I recommend direnv for that: https://direnv.net/

      Its intended use case is loading environment variables (you could use this to load your virtualenv), but it works by sourcing a script — and that script can be ‘cat usage.txt.’
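
      A minimal `.envrc` along those lines might be (direnv captures the environment changes, so the venv activation sticks):

        # .envrc -- run by direnv every time you cd into this directory
        cat usage.txt
        source .venv/bin/activate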

      Great tool.

      If you use Emacs (and you should!), there’s a direnv mode. Emacs also has its own way to set configuration items within a directory (directory-local variables), and is smart enough to support two files, so that there can be one file checked into source control for all members of a project and another ignored for one’s personal config.

    • hellcow 5 hours ago

      direnv does exactly what you describe (and a lot more) using flake.nix. cd into the directory and it automatically runs. I use it in every single project/repository to set environment variables and install project-specific dependencies locked to specific versions.

      • eadmund 5 hours ago

        > direnv does exactly what you describe (and a lot more) using flake.nix

        Direnv is awesome! Note, though, that it does not depend on Nix, just a Unix-like OS and a supported shell: https://direnv.net/#prerequisites

    • oulipo2 4 hours ago

      As other comments say, direnv does that, but honestly you should look into mise-en-place (mise), which is really great and also includes a "mini-direnv".

soiltype a day ago

This is exactly the kind of stuff I'm most interested in finding on HN. How do other developers work, and how can I get better at my work from it?

What's always interesting to me is how many of these I'll see and initially think, "I don't really need that." Because I'm well aware of the effect (which I'm sure has a name - I suppose it's similar to induced demand) of "make $uncommon_task much cheaper" -> "$uncommon_task becomes the basis of an entirely new workflow/skill". So I'm going to try out most of them and see what sticks!

Also: really love the style of the post. It's very clear but also includes super valuable information about how often the author actually uses each script, to get a sense ahead of time for which ones are more likely to trigger the effect described above.

A final aside about my own workflows which betrays my origins... for some of these operations, and for others I occasionally need, I'll just open a browser dev tools window and use JS to do it, for example lowercasing a string :)

  • klaussilveira 5 hours ago

    This is one of the things I miss the most about hacker conferences. The sharing of tools, scripts, tips and tricks. It was, and still is, just as fun as trading cards.

  • chipsrafferty a day ago

    I'd love to see a cost benefit analysis of the author's approach vs yours, which includes the time it took the author to create the scripts, remember/learn to use them/reference them when forgetting syntax, plus time spent migrating whenever changing systems.

    • taejavu 19 hours ago

      Not all time is created equal. I’ll happily invest more time than I’ll ever get back in refining a script or vim config or whatever, so that later, when I’m busy and don’t have time to muck around, I can stay in the flow and not be annoyed by distractions.

    • karczex 20 hours ago

      Sometimes it's a matter of sanity rather than time management. I once created a systemd service which goes to a company web page and downloads some files which I sometimes need. The script was pretty hacky, and writing it took me a lot of time - probably more than clicking through that page manually would have taken in the long run. But clicking it is so annoying that I feel it was totally worth it.

    • latexr 9 hours ago

      > reference them when forgetting syntax

      If you have to do that, the script needs improvement. Always add a `--help` which explains what it does and what arguments it takes.
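
      A tiny pattern that covers most of it (one way to do it in shell):

        case "$1" in
          -h|--help)
            echo "usage: myscript [options] <input>"
            echo "Frobnicates <input> and prints the result to stdout."
            exit 0
            ;;
        esac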

      • tom_ 5 hours ago

        If you write these sorts of things in Python, argparse is worth investigating: https://docs.python.org/3/library/argparse.html - it's pretty easy to use, makes it easy to separate the command line handling from the rest of the code, and, importantly, will generate a --help page for you. And if you want something it can't do, you can still always write the code yourself!

        • latexr 4 hours ago

          I don’t like Python in general, but even so I’ll say that argparse is indeed very nice. When I was writing ruby, I always felt that OptionParser¹ wasn’t as good. Swift has Argument Parser², officially from Apple, which is quite featureful. For shell, I have a couple of bespoke patterns I have been reusing in every script for many years.

          ¹ https://github.com/ruby/optparse

          ² https://github.com/apple/swift-argument-parser

    • te_cima a day ago

      why is this interesting to you? the whole point of doing all of this is to be more efficient in the long run. of course there is an initial setup cost and learning curve, after which you will hopefully feel quite efficient with your development environment. you are making it sound like it is not worth the effort because you have to potentially spend time learning "it"? i do not believe that it takes long to "learn" it, but of course it can differ a lot from person to person. your remarks seem like non-issues to me.

      • akersten a day ago

        It's interesting because there's a significant chance one wastes more time tinkering around with custom scripts than saving in the long run. See https://xkcd.com/1205/

        For example. The "saves 5 seconds task that I do once a month" from the blog post. Hopefully the author did not spend more than 5 minutes writing said script and maintaining it, or they're losing time in the long run.

        • duskdozer 16 hours ago

          Maybe, but

          1. even if it costs more time, it could also save more annoyance which could be a benefit

          2. by publishing the scripts, anyone else who comes across them can use them and save time without the initial cost. similarly, making and sharing these can encourage others to share their own scripts, some of which the author could save time with

        • skydhash 18 hours ago

          Sometimes, you explore to have ideas. By fixing a few problems like these, you learn about technologies that can help you in another situation.

        • janalsncm 13 hours ago

          Not all time is created equally though, so I disagree with that xkcd.

          If something is time sensitive it is worth spending a disproportionate amount of time to speed things up at some later time. For example if you’re debugging something live, in a live presentation, working on something with a tight deadline etc.

          Also you don’t necessarily know how often you’ll do something anyways.

          • normie3000 9 hours ago

            > I disagree with that xkcd

            The xkcd doesn't seem to be pushing an agenda, just providing a lookup table. Time spent vs time saved is factual.

        • latexr 8 hours ago

          One thing which is often ignored in these discussions is the experience you gain. The time you “wasted” on your previous scripts by taking longer to write them compounds in time saved in the future because you can now write more complex tasks faster.

          • dbalatero an hour ago

            The problem is, to really internalize that benefit, one would need to have an open mind to trying things out, and many folks seem to resist that. Oh well, more brain connections for me I suppose.

        • kelvinjps10 21 hours ago

          I find that now with AI, you can make scripts very quickly, reducing the time to write them by a lot. There is still some time needed for prompting and testing but still.

        • r4tionalistmind 7 hours ago

              >YOU DON'T UNDERSTAND. I NEED TO BE CONSTANTLY OPTIMIZING MY UPTIME. THE SCIENCE DEMANDS IT. TIMEMAXXING. I CAN'T FREELY EXPLORE OR BRAINSTORM, IT'S NOT XKCD 1205 COMPLIANT. I MUST EVALUATE EVERY PROPOSED ACTIVITY AGAINST THE TIME-OPTIMIZATION-PIVOT-TABLE.
oceanplexian a day ago

It's weird how the circle of life progresses for a developer or whatever.

- When I was a fresh engineer I used a pretty vanilla shell environment

- When I got a year or two of experience, I wrote tons of scripts and bash aliases and had a 1k+ line .bashrc the same as OP

- Now, as a more tenured engineer (15 years of experience), I basically just want a vanilla shell with zero distractions, aliases or scripts and use native UNIX implementations. If it's more complicated than that, I'll code it in Python or Go.

  • chis 21 hours ago

    I think it's more likely that this comes from a place of laziness than some enlightened peak. (I say this as someone who does the same, and is lazy.)

    When I watch the work of coworkers or friends who have gone down these rabbit holes of customization, I always learn some interesting new tools to use - lately I've added atuin, fzf, and a few others to my linux install

    • bigwheels 19 hours ago

      Atuin is new to me!

      https://github.com/atuinsh/atuin

      Discussed 4 months ago:

      Atuin – Magical Shell History https://news.ycombinator.com/item?id=44364186 - June 2025, 71 comments

      • auraham 17 hours ago

        I gave it a try a few months ago, but it did not work for me. My main issue is that atuin broke my workflow with fzf (if I remember correctly, pressing ctrl + r to look up my shell history did not work well after installing atuin).

        • TsiCClawOfLight an hour ago

          This is configurable! I use atuin, but fzf with ctrl-r.

        • bigwheels 16 hours ago

          I'm sympathetic, also a longtime fzf user here. I install it reflexively on any system I use for more than a day or two.

      • tacker2000 10 hours ago

        I like atuin but why is it so slow when first opening (hitting up) in the shell?

        • johntash 10 hours ago

          I'd recommend disabling atuin when hitting up and just leave it on ctrl+r instead

        • YouAreWRONGtoo 8 hours ago

          Either it wasn't a design goal or they are stupid. Why don't you tell us?

          The right way this would work is via a systemd service and then it should be instant.

    • heyitsguay 21 hours ago

      I went through a similar cycle. Going back to simplicity wasn't about laziness for me; it was because I started working across a bunch more systems and didn't want to do my whole custom setup on all of them, especially ephemeral stuff like containers allocated on a cluster for a single job. So rather than using my fancy setup sometimes and fumbling through the defaults at other times, I just got used to operating more efficiently with the defaults.

      • nijaru 20 hours ago

        You can apply your dotfiles to servers you SSH into rather easily. I'm not sure what your workflow is like but frameworks like zsh4humans have this built in, and there are tools like sshrc that handle it as well. Just automate the sync on SSH connection. This also applies to containers if you ssh into them.

        • theshrike79 20 hours ago

          I'm guessing you haven't worked in Someone Else's environment?

          The amount of shit you'll get for "applying your dotfiles" on a client machine or a production server is going to be legendary.

          Same with containers, please don't install random dotfiles inside them. The whole point of a container is to be predictable.

          • nijaru 19 hours ago

            Do you have experience with these tools? Some such as sshrc only apply temporarily per session and don't persist or affect other users. I keep plain 'ssh' separate from shell functions that apply dotfiles and use each where appropriate. You can also set up temporary application yourself pretty easily.

          • LinXitoW 7 hours ago

            In other replies you explicitly state how rare it is that you log in to other systems.

            Aren't you therefore optimizing for 1% of the cases, but sabotaging the 99%?

          • YouAreWRONGtoo 19 hours ago

            Someone else's environment? That should never happen. You should get your own user account and that's it.

            • mlrtime 5 hours ago

              Sometimes we need to use service accounts, so while you do have your own account, all the interesting things happen in svc_foo, to which you cannot add your .files.

            • theshrike79 12 hours ago

              I don’t even get an account on someone else’s server. There’s no need for me to log in anywhere unless it’s an exceptional situation.

              • YouAreWRONGtoo 7 hours ago

                This doesn't make sense.

                You said you were already using someone else's environment.

                You can't later say that you don't.

                Whether or not shell access makes sense depends on what you are doing, but a well written application server running in a cloud environment doesn't need any remote shell account.

                It's just that approximately zero typical monolithic web applications meet that level of quality and given that 90% of "developers" are clueless, often they can convince management that being stupid is OK.

                • 1718627440 6 hours ago

                  They do get to work on someone else's server, but they do not get a separate account on that server. Their client would not be happy to have them mess around with the environment.

                  • YouAreWRONGtoo 2 hours ago

                    By definition, if the client Alice gives contractor Mallory access to user account alice, that's worse than giving them an account called mallory.

                    Accounts are basically free. Not having accounts; that's expensive.

          • fragmede 20 hours ago

            If, in the year 2025, you are still using a shared account called "root" (password: "password"), and it's not a hardware switch or something (and even they support user accounts these days), I'm sorry, but you need to do better. If you're the vendor, you need to do better, if you're the client, you need to make it an issue with the vendor and tell them they need to do better. I know, it's easy for me to say from the safety of my armchair at 127.0.0.1. I've got some friends in IT doing support that have some truly horrifying stories. But holy shit why does some stuff suck so fucking much still. Sorry, I'm not mad at you or calling you names, it's the state of the industry. If there were more pushback on broken busted ass shit where this would be a problem, I could sleep better at night, knowing that there's somebody else that isn't being tortured.

            • theshrike79 12 hours ago

              It’s 2025. I don’t even have the login password to any server, they’re not unicorns, they’re cattle.

              If something is wrong with a server, we terminate it and spin up a new one. No need for anyone to log in.

              In very rare cases it might be relevant to log in to a running server, but I haven’t done that in years.

      • tester457 17 hours ago

        The defaults are unbearable. I prefer using chezmoi to feel at home anywhere. There's no reason I can't at least have my aliases.

        I'd rather take the pain of writing scripts to automate this for multiple environments than suffer the death by a thousand cuts which are the defaults.

        • fragmede 16 hours ago

          chezmoi is the right direction, but I don't want to have to install something on the other server, I should just be able to ssh to a new place and have everything already set up, via LocalCommand and Host * in my ~/.ssh/config

  • trenchpilgrim a day ago

    When I had one nix computer, I wanted to customize it heavily.

    Now I have many nix computers and I want them consistent and with only the most necessary packages installed.

    • sestep 21 hours ago

      For anyone else reading this comment who was confused because this seems like the opposite of what you'd expect about Nix: Hacker News ate the asterisks and turned them into italics.

      • fragmede 20 hours ago

        use a backslash. \*

        (had to use a double backslash to render that correctly)

        • latexr 20 hours ago

          Or two consecutive asterisks: ** becomes *

    • ozim 21 hours ago

      Besides many nix computers I also have a wife, dog, children, chores, and shopping to be done. Unlike when I was a young engineer, I can no longer stay up all night fiddling with bash scripts and environments.

      • soraminazuki 10 hours ago

        What does your wife, dog, children, chores, and shopping have to do with custom configuration and scripts? Just set up a Git repo online, put your files there, and take a couple of minutes to improve it incrementally when you encounter inconveniences. And just like that, you made your life easier for a marginal effort.

        • 1718627440 6 hours ago

          They compete for time.

          • mlrtime 5 hours ago

            Don't even try to explain the scripts to wife*, try the dog. At least he'll understand it just as much and be enthusiastic to hear it!

            *may not be applicable to all wives, ymmv.

            • TsiCClawOfLight an hour ago

              I taught my wife LaTeX, she loves me for it :D

          • soraminazuki 5 hours ago

            I'm saying that makes no sense, as I wrote in the comment you're replying to.

    • Ferret7446 9 hours ago

      I don't get why this is a problem. Just stick all your configs in a git repo and clone it wherever you need it.

  • D13Fd a day ago

    I would still call my Python scripts “scripts.” I don’t think the term “scripts” is limited to shell scripts.

  • planb a day ago

    Yeah - been there, done that, too. I feel like the time I gain from having a shortcut is often less than what I would need to maintain it or to remember the real syntax when I'm on a machine where it's not available (which happens quite often in my case). I try to go with system defaults as much as possible nowadays.

  • heap_perms 10 hours ago

    I can't say I relate at all (5 years of experience). They'll have to pry my 1000-line .zshrc from my cold, dead hands. For example, zsh-autosuggestions improves my quality of life so ridiculously much it's not even funny.

    • cvak 10 hours ago

      I moved away from a 1000-line .zshrc when I had to do stuff on linux VMs/dockers and I was lost a lot. But yeah, zsh-autosuggestions and fzf-tab are not going anywhere.

  • jamesbelchamber a day ago

    I am going through a phase of working with younger engineers who have many dotfiles, and I just think "Oh, yeh, I remember having lots of dotfiles. What a hassle that was."

    Nowadays I just try to be quite selective with my tooling and learn to change with it - "like water", so to speak.

    (I say this with no shade to those who like maintaining their dotfiles - it takes all sorts :))

    • dbalatero an hour ago

      I've been programming 30 years and I really don't find it a hassle:

      - if you commit them to git, they last your entire career

      - improving your setup is basically compound interest

      - with a new laptop, my setup script might cause me 15 minutes of fixing a few things

      - the more you do it, the less any individual hassle becomes, and the easier it looks to make changes – no more "i don't have time" mindset

  • subsection1h 19 hours ago

    > When I was a fresh engineer I used a pretty vanilla shell environment. When I got a year or two of experience, I wrote tons of scripts

    Does this mean that you learned to code to earn a paycheck? I'm asking because I had written hundreds of scripts and Emacs Lisp functions to optimize my PC before I got my first job.

  • grimgrin 21 hours ago

    this is how it works for you

    as a person who loves their computer, my ~/bin is full. i definitely (not that you said this) do not think "everything i do has to be possible on every computer i am ever shelled into"

    being a person on a computer for decades, i have tuned how i want to do things that are incredibly common for me

    though perhaps you're referring to work and not hobby/life

  • eikenberry a day ago

    Prepare to swing back again. With nearly 30 years experience I find the shell to be the best integration point for so many things due to its ability to adapt to whatever is needed and its universal availability. My use of a vanilla shell has been reduced to scripting cases only.

  • nonethewiser a day ago

    On the other hand, the author seems to have a lot of experience as well.

    Personally I tend to agree... there is a very small subset of things I find worth aliasing. I have a very small amount and probably only use half of them regularly. Frankly I wonder how my use case is so different.

    edit: In the case of the author, I guess he probably wants to live in the terminal full time. And perhaps offline. There is a lot of static data he's stored, like http status codes: https://codeberg.org/EvanHahn/dotfiles/src/commit/843b9ee13d...

    In my case I'd start typing it in my browser and then just click something I've visited 100 times before. There is something to be said for reducing that redundant network call, but I don't think it makes much practical difference, and the mental mapping/discoverability of aliases isn't nothing.

  • mlrtime 5 hours ago

    For the Infra Engineers out there who still manage fleets of pets, this is doubly true. You may not have access or be able to use all your shortcut scripts, so you'd better know the raw commands on that unsupported RHEL6 host.

  • shermanyo 17 hours ago

    I use a dotfile with aliases and functions, mostly to document / remember commands I find useful. It's been a handy way to build a living document of the utils I use regularly, and is easy to migrate to each new workstation.

  • dylan604 16 hours ago

    man, i couldn't live without alias ..='cd ..' or alias ...='cd ../..'

    to this day, i still get tripped up when using a shell for the first time without those as they're muscle memory now.

    • fiddlerwoaroof 15 hours ago

      I just use the autocd zsh shell option for this. And I also use `hash -d` to define shortcuts for common directories. Then just “executing” something like `~gh/apache/kafka` will cd to the right place.
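
      For example (the path here is just illustrative):

        # in ~/.zshrc
        setopt autocd
        hash -d gh=~/src/github.com   # ~gh now expands to this directory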

    • 1718627440 6 hours ago

      Thanks. I hadn't considered these aliases, but they seem useful, so I just added them for my user. :-)

    • 400thecat 3 hours ago

      you can configure Alt+Left to go up a level

  • Mikhail_Edoshin 13 hours ago

    Given the nature of current operating systems and applications, do you think the idea of “one tool doing one job well” has been abandoned? If so, do you think a return to this model would help bring some innovation back to software development?

    Rob Pike: Those days are dead and gone and the eulogy was delivered by Perl.

    • president_zippy 11 hours ago

      But was the eulogy written in Perl poetry? I see it everywhere, but I don't know who this JAPH guy is. It's a strange way of spelling Jeff, and it's odd that he types his name in all caps, but he has published a remarkable quantity of works and he's even more famous than the anonymous hacker known as 4chan.

    • npodbielski 4 hours ago

      Oh, I hate that paradigm. Well, maybe chmod and ls and rsync and curl all do their OWN thing very well, but every time I use one of those tools I have to remember whether, e.g., the more detailed output is -v or maybe -vvv or --verbose or -x for some reason, because the maintainer felt like it at 2:32 in the morning 17 years ago... Some consistency would help, but... it's probably impossible; the flame war over -R being recursive or read-only would never end.

  • imiric 19 hours ago

    I've heard this often, but I'm going on ~25 years of using Linux, and I would be lost without my dotfiles. They represent years of carefully crafting my environment to suit my preferences, and without them it would be like working on someone else's machine. Not impossible, just very cumbersome.

    Admittedly, I've toned down the configs of some programs, as my usage of them has evolved or diminished, but many are still highly tailored to my preferences. For example, you can't really use Emacs without a considerable amount of tweaking. I mean, you technically could, but such programs are a blank slate made to be configured (and Emacs is awful OOB...). Similarly for zsh, which is my main shell, although I keep bash more vanilla. Practically the entire command-line environment and the choices you make about which programs to use can be considered configuration. If you use NixOS or Guix, then that extends to the entire system.

    If you're willing to allow someone else to tell you how you should use your computer, then you might as well use macOS or Windows. :)

  • denimnerd42 a day ago

    I prefer using kubectl over any other method, so I have plenty of functions to help with that. I'd never consider using python or go for this, although I do have plenty of python and go "scripts" on my path too.

  • apprentice7 20 hours ago

    It's the bell curve meme all along.

  • stronglikedan 21 hours ago

    Different strokes for different folks - tenured engineers just settle into whatever works best for them.

  • russellbeattie 19 hours ago

    The moment of true enlightenment is when you finally decide to once and for all memorize all the arguments and their order for those command line utilities that you use at an interval that's just at the edge of your memory: xargs, find, curl, rsync, etc.

    That, plus knowing how to parse a man file to actually understand how to use a command (a skill that takes years to master) pretty much removes the need for most aliases and scripts.

    • npodbielski 4 hours ago

      Why would I even attempt to do that? Life is too short to try to remember something like that. Maybe 20 years ago, when internet access was not that common. Or maybe if you are a hacker, hacking other people's machines. Me? Just some dev trying to make some money to feed my family? I prefer to take a walk in the woods.

    • whatevertrevor 18 hours ago

      I already have limited space for long term memory, bash commands are very far down the list of things I'd want to append to my long term storage.

      I use ctrl-R with a fuzzy matching program, and let my terminal remember it for me.

      And before it's asked: yes that means I'd have more trouble working in a different/someone else's environment. But as it barely ever happens for me, it's hardly an important enough scenario to optimize for.

  • fragmede a day ago

    If you come through the other side, you set up LocalCommand in your .ssh/config which copies your config to every server you ssh to, and get your setup everywhere.

  • bdangubic 16 hours ago

    or just ask claude etc to do it for ya

southwindcg a day ago

Regarding the `line` script, just a note that sed can print an arbitrary line from a file, no need to invoke a pipeline of cat, head, and tail:

    sed -n 2p file
prints the second line of file. The advantage sed has over this line script is it can also print more than one line, should you need to:

    sed -n 2,4p file
prints lines 2 through 4, inclusive.

  • tonmoy a day ago

    It is often useful to chain multiple sed commands and sometimes shuffle them around. In those cases I would need to keep changing the first sed. Sometimes I need to grep before I sed. Using cat, tail and head makes things more modular in the long run, I feel. It's the ethos of each command doing one small thing.

    • 1-more 21 hours ago

      yeah I almost always start with `cat` but I still pipe it into `sed -n 1,4p`

    • southwindcg a day ago

      True, everything depends on what one is trying to do at the time.

Noumenon72 19 hours ago

While you're creating and testing aliases, it's handy to source your ~/.zshrc whenever you edit it:

    alias vz="vim ~/.zshrc && . ~/.zshrc"
I alias mdfind to grep my .docx files on my Mac:

    docgrep() {
      mdfind "\"$@\"" -onlyin /Users/xxxx/Notes 2> >(grep --invert-match ' \[UserQueryParser\] ' >&2) | grep -v -e '/Inactive/' | sort
    }
I use an `anon` function to anonymize my Mac clipboard when I want to paste something to the public ChatGPT, company Slack, private notes, etc. I ran it through itself before pasting it here, for example.

    anonymizeclipboard() {
      my_user_id=xxxx
      account_ids="1234567890|1234567890"  #regex
      corp_words="xxxx|xxxx|xxxx|xxxx|xxxx"  #regex
      project_names="xxxx|xxxx|xxxx|xxxx|xxxx"  # regex
      pii="xxxx|xxxx|xxxx|xxxx|xxxx|xxxx"  # regex
      hostnames="xxxx|xxxx|xxxx|xxxx|xxxx|xxxx|xxxx|xxxx|xxxx"  # regex
      # anonymize IPs
      pbpaste | sed -E -e 's/([0-9]{1,3})\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}/\1.x.x.x/g' \
      -e "s/(${corp_words}|${project_names}|${my_user_id}|${pii}|${hostnames})/xxxx/g" -e "s/(${account_ids})/1234567890/g" | pbcopy
      pbpaste
    }

    alias anon=anonymizeclipboard
It prints the new clipboard to stdout so you can inspect what you'll be pasting for anything it missed.

  • xwowsersx 16 hours ago

    ha! alias vz="vim ~/.zshrc && . ~/.zshrc" is brilliant. Editing zshrc and sourcing is something I do pretty often. Never thought to alias it.

  • 1718627440 6 hours ago

    What's the difference between 'source' and '.' ?

    • iguessthislldo an hour ago

      I think they're the same except '.' is POSIX and 'source' is specific to bash and compatible shells. I personally just use source since it's easier to read and zsh and bash account for basically 100% of my shell usage.

  • banku_brougham 17 hours ago

    brilliant! this happens all the time and I never found a convenient way to manage it

andai 19 hours ago

I use these two all the time to encode and cut mp4s.

The flags are for maximum compatibility (e.g. without them, some MP4s don't play in WhatsApp, or Discord on mobile, or whatever.)

    ffmp4() {
        input_file="$1"
        output_file="${input_file%.*}_sd.mp4"

        ffmpeg -i "$input_file" -c:v libx264 -crf 33 -profile:v baseline -level 3.0 -pix_fmt yuv420p -movflags faststart "$output_file"

        echo "Compressed video saved as: $output_file"
    }
    
    
ffmp4 foo.webm

-> foo_sd.mp4

    fftime() {
        input_file="$1"
        output_file="${input_file%.*}_cut.mp4"
        ffmpeg -i "$input_file" -c copy -ss "$2" -to "$3" "$output_file"

        echo "Cut video saved as: $output_file"
    }

fftime foo.mp4 01:30 01:45

-> foo_cut.mp4

Note, fftime copies the audio and video data without re-encoding, which can be a little janky, but often works fine, and can be much (100x) faster on large files. To re-encode just remove "-c copy"

dannyobrien a day ago

Historical note: getting hold of these scripts by chatting to various developers was the motivation for the original 2004 "lifehacks" talk[1][2]. If you ever get into an online argument over what is a "life hack" and what isn't, feel free to use short scripts like these as the canonical example.

Otherwise, I am happy to be pulled into your discussion, Marshall McLuhan style[3] to adjudicate, for a very reasonable fee.

[1] https://craphound.com/lifehacksetcon04.txt

[2] https://archive.org/details/Notcon2004DannyOBrienLifehacks

[3] https://www.openculture.com/2017/05/woody-allen-gets-marshal...

abetusk 18 hours ago

I'm kicking myself for not thinking of the `nato` script.

I tend to try not to get too used to custom "helper" scripts because I become incapacitated when working on other systems. Nevertheless, I really appreciate all these scripts, if nothing else than to see what patterns other programmers pick up.

My only addition is a small `tplate` script that creates HTML, C, C++, Makefile, etc. "template" files to start a project. Kind of like a "wizard setup". e.g.

  $ tplate c
  #include <stdio.h>
  #include <stdlib.h>
  int main(int argc, char **argv) {
  }
And of course, three scripts `:q`, `:w` and `:wq` that get used surprisingly often:

  $ cat :q
  #!/bin/bash
  echo "you're not in vim"

bemmu 4 hours ago

The one I use the most is "cdn". It cds to the newest subdirectory.

So if I'm in my projects folder and want to keep working on my latest project, I just type "cdn" to go there.
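
Something like this does the trick (a sketch of the idea, not necessarily the exact script):

    # cd into the most recently modified subdirectory
    cdn() { cd -- "$(ls -td -- */ | head -n 1)"; }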

alberand 20 hours ago

My fav script to unpack anything, found a few years ago somewhere

      # ex - archive extractor
      # usage: ex <file>
      function ex() {
          if [ -f "$1" ] ; then
          case "$1" in
              *.tar.bz2) tar xjf "$1" ;;
              *.tar.gz) tar xzf "$1" ;;
              *.tar.xz) tar xf "$1" ;;
              *.bz2) bunzip2 "$1" ;;
              *.rar) unrar x "$1" ;;
              *.gz) gunzip "$1" ;;
              *.tar) tar xf "$1" ;;
              *.tbz2) tar xjf "$1" ;;
              *.tgz) tar xzf "$1" ;;
              *.zip) unzip "$1" ;;
              *.Z) uncompress "$1" ;;
              *.7z) 7z x "$1" ;;
              *) echo "'$1' cannot be extracted via ex()" ;;
          esac
          else
              echo "'$1' is not a valid file"
          fi
      }

  • _whiteCaps_ 15 hours ago

    `tar xf` autodetects compressed files now. You can replace any of your instances of tar with that.

    • alberand 2 hours ago

      Honestly, it doesn't need any updates; it works great without any pain. I'm just happy with it.

    • soraminazuki 9 hours ago

      Yes, but only bsdtar has support for zip, rar, and 7z.

  • rbonvall 19 hours ago

    I use dtrx, which also ensures that all files are extracted into a folder.

  • juancroldan 20 hours ago

    That's brilliant. Now I need its compressing counterpart.
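
    Something along these lines might do as a starting point (a sketch, assuming the usual archivers are installed):

      # pack <archive> <files...> -- choose the compressor from the archive extension
      function pack() {
          out="$1"; shift
          case "$out" in
              *.tar.bz2|*.tbz2) tar cjf "$out" "$@" ;;
              *.tar.gz|*.tgz)   tar czf "$out" "$@" ;;
              *.tar.xz)         tar cJf "$out" "$@" ;;
              *.zip)            zip -r "$out" "$@" ;;
              *.7z)             7z a "$out" "$@" ;;
              *)                echo "don't know how to pack '$out'" ;;
          esac
      }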

    • alberand 2 hours ago

      For compression, I have one for .tar.gz, but it doesn't see much use on my system. I need something a bit easier than 'pack file file file archive.tar.gz'.

  • YouAreWRONGtoo 19 hours ago

    Now, add inotify and a systemd user service and you would be getting somewhere. Also packaged versions of that exist already.

    So, you created a square wheel, instead of a NASA wheel.

WA 11 hours ago

One script I use quite often:

    function unix() {
      if [ $# -gt 0 ]; then
        echo "Arg: $(date -r "$1")"
      fi
      echo "Now: $(date) - $(date +%s)"
    }
Prints the current date as a UNIX timestamp. If you provide a UNIX timestamp as an arg, it prints the arg as a human-readable date.

SoftTalker a day ago

Some cool things here but in general I like to learn and use the standard utilities for most of this. Main reason is I hop in and out of a lot of different systems and my personal aliases and scripts are not on most of them.

sed, awk, grep, and xargs along with standard utilities get you a long long way.

  • scoodah 11 hours ago

    Same. I interact with too many machines, many of which are ephemeral and will have been reprovisioned the next time I have to interact with it.

    I value out of the box stuff that works most everywhere. I have a fairly lightweight zsh config I use locally, but it's mostly just stuff like a status line that suits me, better history settings, etc. Stuff I won't miss if it's not there.

  • pinkmuffinere a day ago

    I totally agree with this, I end up working on many systems, and very few of them have all my creature comforts. At the same time, really good tools can stick around and become impactful enough to ship by default, or to be easily apt-get-able. I don't think a personal collection of scripts is the way, but maybe a well maintained package.

linsomniac 3 hours ago

The Gen AI tooling is exceptionally good at doing these sorts of things, and way more than just "mkdir $1 && cd $1". For example:

I have used it to build an "escmd" tool for interacting with Elasticsearch. It makes the available commands much more discoverable, formats the output in tables, and gets rid of sending JSON to a curl command.

A variety of small tools that interact with Jira (list my tickets, show tickets that are tagged as needing ops interaction in the current release).

A tool to interact with our docker registry to list available tags and to modify tags, including colorizing them based on the sha hash of the image so it's obvious which ones are the same. We manage docker container deploys based on tags so if we "cptag stg prod" on a project, that releases the staging artifact to production, but we also tag them by build date and git commit hash, so we're often working with 5-7 tags.

Script to send a "Software has successfully been released" message via gmail from the command-line.

A program to "waituntil" a certain time to run a command: "waituntil 20:00 && run_release", with nice display of a countdown.

I have a problem with working on too many things at once and then committing unrelated things tagged with a particular Jira case. So I had it write me a commit program that lists my tickets, shows the changed files, and lets me select which ones go with that ticket.

All these are things I could have built before, but each would have taken me hours. With Gen AI, they take 5-15 minutes of my attention. And Gen AI seems really, really great at building these small, independent tools.
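
For reference, the "waituntil" part can be sketched in a few lines of bash (assuming GNU date; this is not the actual tool described above):

    # waituntil HH:MM -- sleep until that time (today, or tomorrow if it has
    # already passed), printing a simple countdown.
    waituntil() {
      local target now
      target=$(date -d "$1" +%s) || return 1
      now=$(date +%s)
      if (( target <= now )); then
        target=$(date -d "tomorrow $1" +%s)
      fi
      while (( $(date +%s) < target )); do
        printf '\r%6d seconds to go ' "$(( target - $(date +%s) ))"
        sleep 1
      done
      printf '\n'
    }
    # waituntil 20:00 && run_release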

Noumenon72 19 hours ago

> ocr my_image.png extracts text from an image and prints it to stdout. It only works on macOS

The Mac Shortcut at https://github.com/e-kotov/macos-shortcuts lets you select a particular area of the screen (as with Cmd-Shift-4) and copies the text out of that, allowing you to copy exactly the text you need from anywhere on your screen with one keyboard shortcut. Great for popups with unselectable text, and copying error messages from coworkers' screenshares.

  • 0cf8612b2e1e 11 hours ago

    I have a Linux equivalent that uses maim to select a region and then tesseract to do the OCR.
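
    Roughly like this (one possible form, assuming maim, tesseract and xclip):

        maim -s | tesseract stdin stdout | xclip -selection clipboard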

o11c 21 hours ago

I keep meaning to generalize this (directory target, multiple sources, flags), but I get quite a bit of mileage out of this `unmv` script even as it is:

  #!/bin/sh
  if test "$#" != 2
  then
      echo 'Error: unmv must have exactly 2 arguments'
      exit 1
  fi
  exec mv "$2" "$1"
dcassett 15 hours ago

I find that I like working with the directory stack and having a shortened version of the directory stack in the title bar, e.g. by modifying the stock Debian .bashrc

  # If this is an xterm set the title to the directory stack
  case "$TERM" in
  xterm*|rxvt*)
      if [ -x ~/bin/shorten-ds.pl ]; then
          PS1="\[\e]0;\$(dirs -v | ~/bin/shorten-ds.pl)\a\]$PS1"
      else
          PS1="\[\e]0;${debian_chroot:+($debian_chroot)}\u@\h: \w\a\]$PS1"
      fi
      ;;
  *)
      ;;
  esac
The script shorten-ds.pl takes e.g.

  0  /var/log/apt
  1  ~/Downloads
  2  ~
and shortens it to:

  0:apt 1:Downloads 2:~

  #!/usr/bin/perl -w
  use strict;
  my @lines;
  while (<>) {
    chomp;
    s%^ (\d+)  %$1:%;
    s%:.*/([^/]+)$%:$1%;
    push @lines, $_
  }
  print join ' ', @lines;

That coupled with functions that take 'u 2' as shorthand for 'pushd +2' and 'o 2' for 'popd +2' make for easy manipulation of the directory stack:

  u() {
    if [[ $1 =~ ^[0-9]+$ ]]; then
      pushd "+$1"
    else
      pushd "$@"
    fi
  }

  o() {
    if [[ $1 =~ ^[0-9]+$ ]]; then
      popd "+$1"
    else
      popd "$@" # lazy way to cause an error
    fi
  }
yipbub a day ago

I have mkcd exactly (I wonder how many of us do, it's so obvious).

I have almost the same, but differently named with scratch(day), copy(xc), markdown quote(blockquote), murder, waitfor, tryna, etc.

I used to use telegram-send with a custom notification sound a lot, for notifications from long-running scripts when I walked away from the laptop.

I used to have one called timespeak that would speak the time to me every hour or half hour.

I have go_clone that clones a repo into GOPATH which I use for organising even non-go projects long after putting go projects in GOPATH stopped being needed.

I liked writing one-offs, and I don't think it's premature optimization because I kept getting faster at it.

  • justusthane a day ago

    Obviously that script is more convenient, but if you’re on a system where you don’t have it, you can do the following instead:

        mkdir /some/dir    
        cd !$   
        (or cd <alt+.>)
  • mttpgn 12 hours ago

    I too have a `mkcd` in my .zshrc, but I implemented it slightly differently:

      function mkcd {
        newdir="$1"
        mkdir -p "$newdir"
        cd "$newdir"
      }
  • linsomniac 14 hours ago

    >I have mkcd exactly ( I wonder how many of us do, it's so obvious)

    Mine is called "md" and it has "-p" on the mkdir. "mkdir -p $1 && cd $1"

  • taejavu 19 hours ago

    Doesn’t the built in `take` do exactly what `mkcd` does? Or is `take` a zsh/macos specific thing?

    Edit: looks like it’s a zsh thing

    • codesnik 15 hours ago

      it's an .oh-my-zsh thing (~/.oh-my-zsh/lib/functions.zsh) but thanks, I didn't know about it.

  • aib 8 hours ago

    One more from me:

      mkcd() {
        mkdir -p -- "$1" &&
        cd -- "$1"
      }
sdovan1 16 hours ago

I have three different ways to open files with vim: v: vim (or neovim, in my case); vv: search/preview and open a file by filename; vvv: search/preview and open a file by its content.

    alias v='nvim'
    alias vv='f=$(fzf --preview-window "right:50%" --preview "bat --color=always {1}"); test -n "$f" && v "$f"'
    alias vvv='f=$(rg --line-number --no-heading . | fzf -d: -n 2.. --preview-window "right:50%:+{2}" --preview "bat --color=always --highlight-line {2} {1}"); test -n "$(echo "$f" | cut -d: -f1)" && v "+$(echo "$f" | cut -d: -f2)" "$(echo "$f" | cut -d: -f1)"'
arjie 17 hours ago

A few I use are:

    #!/usr/bin/env bash
    # ~/bin/,dehex

    echo "$1" | xxd -r -p

and

    #!/usr/bin/env bash
    # ~/bin/,ht

    highlight() {
      # Foreground:
      # 30:black, 31:red, 32:green, 33:yellow, 34:blue, 35:magenta, 36:cyan

      # Background:
      # 40:black, 41:red, 42:green, 43:yellow, 44:blue, 45:magenta, 46:cyan
      escape=$(printf '\033')
      sed "s,$2,${escape}[$1m&${escape}[0m,g"
    }

    if [[ $# == 1 ]]; then
      highlight 31 "$1"
    elif [[ $# == 2 ]]; then
      highlight 31 "$1" | highlight 32 "$2"
    elif [[ $# == 3 ]]; then
      highlight 31 "$1" | highlight 32 "$2" | highlight 35 "$3"
    elif [[ $# == 4 ]]; then
      highlight 31 "$1" | highlight 32 "$2" | highlight 35 "$3" | highlight 36 "$4"
    fi
I also use the comma-command pattern where I prefix my personal scripts with a `,` which allows me to cycle between them fast etc.

One thing I have found that's worth it is periodically running an aggregation on one's history and purging old ones that I don't use.

internet_points 9 hours ago

With `xsel --clipboard` (put that in an alias like `clip`), you can use the same thing to replace both `copy` and `pasta`:

    # High level examples
    run_some_command | clip
    clip > file_from_my_clipboard.txt
    
    # Copy a file's contents
    clip < file.txt

    # indent for markdown:
    $ clip|sed 's/^/    /'|clip
nberkman 20 hours ago

Nice! Tangentially related: I built a (MacOS only) tool called clippy to be a much better pbcopy. It was just added to homebrew core. Among other things, it auto-detects when you want files as references so they paste into GUI apps as uploads, not bytes.

  clippy image.png  # then paste into Slack, etc. as upload

  clippy -r         # copy most recent download

  pasty             # copy file in Finder, then paste actual file here
https://github.com/neilberkman/clippy / brew install clippy
  • Tempest1981 20 hours ago

    Adding the word "then" to your first example would have helped me (lacking context, I thought the comments explained what the command does, as is the common convention):

      clippy image.png   # then paste into Slack, etc. as upload
    
    Also:

      pasty              # paste actual file, after copying file in Finder
    • nberkman 20 hours ago

      Updated, I appreciate it!

  • gigatexal 19 hours ago

    Awesome. Gonna check this out.

hughdbrown 2 hours ago

This is really interesting, but I need the highlights reel. So I need a script to summarize Hacker News pages and/or arbitrary web pages. Maybe that's what I want for getting the juice out of Medium articles.

jrm4 a day ago

Broadly, I very much love this approach to things and wish it were more "acceptable". It reminds me of the opposite of things like "the useless use of cat", which to me is one of the WORST meme-type-things in this space.

Like, it's okay -- even good -- for the tools to bend to the user and not the other way around.

briansm 18 hours ago

Using 'copy' as a clipboard script tells me OP never lived through the DOS era I guess... Used to drive me mad switching between 'cp' in UNIX and 'copy' in DOS. (Same with the whole slash vs backslash mess.)

a_e_k 8 hours ago

I like the NATO one.

It occurred to me that it would be more useful to me in Emacs, and that might make a fun little exercise.

And that's how I discovered `M-x nato-region` was already a thing.

greenpizza13 2 hours ago

For tempe, I recommend changing "cd" to "pushd" so you can "popd" as soon as you're done.

brainzap 3 hours ago

My most used automation copies a file with rclone to backblaze blob storage, and puts the link into the clipboard. (for sharing memes)

and alias debian='docker run -it --rm -v "$(pwd)":/mnt/host -w /mnt/host --name debug-debian debian' (single quotes, so $(pwd) expands when the alias is used rather than when it's defined)

alentred 21 hours ago

> alphabet just prints the English alphabet in upper and lowercase. I use this surprisingly often (probably about once a month)

I genuinely wonder, why would anyone want to use this, often?

  • abetusk 3 hours ago

    As a programmer, you sometimes want to make an alphabet lookup table. So, something like:

      var alpha_lu = "abcdefghijklmnopqrstuvwxyz";
    
    Typing it out by hand is error prone as it's not easy to see if you've swapped the order or missed a character.

    I've needed the alphabet string or lookup rarely, but I have needed it before. Some applications could include making your own UUID function, making a small random naming scheme, associating small categorical numbers to letters, etc.

    The author of the article mentioned they do web development, so it's not hard to imagine they've had to create a URL shortener, maybe more than once. So, for example, creating a small name could look like:

      function small_name(len) {
        let a = "abcdefghijklmnopqrstuvwxyz",
            v = [];
        for (let i=0; i<len; i++) {
          v.push( a[ Math.floor( Math.random()*a.length ) ] );
        }
        return v.join("");
      }
      //...
      small_name(5); // e.g. "pfsor"
    
    Dealing with strings, dealing with hashes, random names, etc., one could imagine needing to do functions like this, or functions that are adjacent to these types of tasks, at least once a month.

    Just a guess on my part though.

  • CGamesPlay 10 hours ago

    If your native language uses a different alphabet, you might not have been taught "the alphabet song". For example, I speak/read passable Russian, but could not alphabetize a list in Russian.

  • dcassett 16 hours ago

    For me it's when I call customer service or support on the phone, and either give them an account #, or confirm a temporary password that I have been verbally given.

    • Tempest1981 13 hours ago

      Are you referring to the nato alphabet utility? Or the alphabet script that prints

        abcdefghijklmnopqrstuvwxyz
        ABCDEFGHIJKLMNOPQRSTUVWXYZ
      • johntash 9 hours ago

        I imagine all of his passwords are abcdefghijklmnopqrstuvwxyz

revicon 21 hours ago

I have a bunch of little scripts and aliases I've written over the years, but none are used more than these...

    alias ..='cd ..'
    alias ...='cd ../..'
    alias ....='cd ../../..'
    alias .....='cd ../../../..'
    alias ......='cd ../../../../..'
    alias .......='cd ../../../../../..'

  • jcgl 2 hours ago

    In fish, I have an abbreviation that automatically expands double dots into ../ so that you can just spam double dots and visually see how far you're going.

      # Modified from
      # https://github.com/fish-shell/fish-shell/issues/1891#issuecomment-451961517
      function append-slash-to-double-dot -d 'expand .. to ../'
       # Get commandline up to cursor
       set -l cmd (commandline --cut-at-cursor)
      
       # Match last line
       switch $cmd[-1]
       case '*.'
        commandline --insert './'
       case '*'
        commandline --insert '.'
       end
      end
  • cosmos0072 20 hours ago

    I need this *so* often that I programmed my shell to execute 'cd ..' every time I press KP/ i.e. '/' on the keypad, without having to hit Return.

    Other single-key bindings I use often are:

    KP* executes 'ls'

    KP- executes 'cd -'

    KP+ executes 'make -j `nproc`'

  • Bishonen88 21 hours ago

      up() {
        local d=""
        for ((i=1; i<=$1; i++)); do
          d="../$d"
        done
        cd "$d"
      }

    up 2, up 3 etc.

  • vunderba 21 hours ago

    Does zsh support this out-of-the-box? Because I definitely never had to setup any of these kinds of aliases but have been using this shorthand dot notation for years.

    • machomaster 20 hours ago

      Yes it does.

      • Noumenon72 20 hours ago

        Not on my Mac.

            zsh: permission denied: ..
            zsh: command not found: ...
  • pfg_ 18 hours ago

    fish lets you cd to a folder without 'cd' although you still need the slashes. I use it all the time.

        c $> pwd
        /a/b/c
        c $> dir1
        dir1 $> ..
        c $> ../..
        / $>
  • tacone 21 hours ago

    I have set up a shortcut: alt+. to run cd .., it's pretty cool.

    I also aliased - to run cd -

    • fragmede 19 hours ago

      but alt-. in bash is used for pasting the last argument to the previous command into the current one.

      • tacone 7 hours ago

        Good point, when working with keybindings, you'll inevitably end up overriding built-ins. I see it as a trade-off, between something I don't know of (and wouldn't use) and something I find useful. Works for me :)

        • fragmede 2 hours ago

          absolutely. From back in the day, the annoying one was GNU screen, which took over ctrl-a by default. I overrode that to be ctrl-^, which in bash is transpose (makes "zx" into "xz"), which was rare enough to be okay with losing.

lolive 8 hours ago

My most important script has been to remap CapsLock as a kind of custom Meta key, that transforms (when pressed) the Space into Return, hjkl into arrows, io into PgUp/PgDn, and 1-9 into function keys. Now I have a 60% keyboard that takes 0 space on my desk. And I am reaaaally happy with this setup.

[that, plus LinkHint plugin for Firefox, and i3 for WM is my way to go for a better life]

jwsteigerwalt 4 hours ago

17 years ago I wrote a short VBA macro that takes the highlighted range of cells, concatenates the values into a comma-separated list, then opens the list in notepad for easy copying and further use. I can't begin to count the number of executions by myself and those I have shared it with.

cool-RR 11 hours ago

The most useful script I wrote is one I call `posh`. It shortens a file path by using environment variables. Example:

  $ posh /home/ramrachum/Dropbox/notes.txt
  $DX/notes.txt
Of course, it only becomes useful when you define a bunch of environment variables for the paths that you use often.

I use this a lot in all of my scripts. Basically, whenever any of my scripts prints a path, it passes it through `posh`.
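
A rough sketch of the idea (not the exact script, which is more involved):

    #!/usr/bin/env bash
    # posh-like sketch: replace the longest environment-variable value that is
    # an absolute-path prefix of the argument with $NAME.
    # Caveat: parses `env` line by line, so multi-line variable values confuse it.
    target=$1
    best_name= best_val=
    while IFS='=' read -r name val; do
      [[ $val == /* && $target == "$val"* ]] || continue
      if (( ${#val} > ${#best_val} )); then
        best_name=$name best_val=$val
      fi
    done < <(env)
    if [[ -n $best_name ]]; then
      printf '$%s%s\n' "$best_name" "${target#"$best_val"}"
    else
      printf '%s\n' "$target"
    fi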

  • oneeyedpigeon 7 hours ago

    I'd love to see this script. Does it use `env` and strip out things like PWD?

    • cool-RR 6 hours ago

      I wrote it in a way that's too intertwined with my other shit to be shareable with people, but honestly you can copy-paste my comment to your friendly neighborhood LLM and you'll get something decent. Indeed it uses `env`.

      • oneeyedpigeon 5 hours ago

        Understood. I'd rather write it myself from scratch than use an LLM; confirmation of the general process should be enough, I hope!

WhyNotHugo a day ago

> cpwd copies the current directory to the clipboard. Basically pwd | copy. I often use this when I’m in a directory and I want use that directory in another terminal tab; I copy it in one tab and cd to it in another. I use this once a day or so.

You can configure your shell to notify the terminal of directory changes, and then use your terminal’s “open new window” function (eg: ctrl+shift+n) to open a new window retaining the current directory.
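
For bash, the shell-side half of that can be a small OSC 7 hook, roughly like this (the terminal has to support OSC 7, and a robust version would percent-encode special characters in the path):

    # In ~/.bashrc: report the cwd to the terminal after every prompt,
    # so "open new window" can inherit it.
    __osc7_cwd() {
      printf '\033]7;file://%s%s\033\\' "$HOSTNAME" "$PWD"
    }
    PROMPT_COMMAND="__osc7_cwd${PROMPT_COMMAND:+; $PROMPT_COMMAND}"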

spiffyk 9 hours ago

> url "$my_url" parses a URL into its parts. I use this about once a month to pull data out of a URL, often because I don’t want to click a nasty tracking link.

This sounds pretty useful!

Coincidentally, I have recently learned that Daniel Stenberg et al (of cURL fame) wrote trurl[1], a libcurl-based CLI tool for URL parsing. Its `--json` option seems to yield similar results as TFA's url, if slightly less concise because of the JSON encoding. The advantage is that recent releases of common Linux distros seem to include trurl in their repos[2].

[1]: https://curl.se/trurl/

[2]: https://pkgs.org/search/?q=trurl
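
Basic usage looks roughly like this (flags from memory, so double-check against `trurl --help`):

    $ trurl --json 'https://example.com/a/b?x=1'
    $ trurl 'https://example.com/a/b?x=1' --get '{host}'
    example.com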

internet_points 8 hours ago

on my ubuntu, `date -I` does iso dates

Also re: alphabet

    $ echo {a..z}
    a b c d e f g h i j k l m n o p q r s t u v w x y z
  • oneeyedpigeon 7 hours ago

    date -I even works on macOS, which I was pleasantly surprised by!

    If you want the exact alphabet behaviour as the OP:

        $ echo {a..z} $'\n' {A..Z} | tr -d ' '
cshores 30 minutes ago

I have a script called catfiles that I store in ~/.local/bin. It recursively dumps every source file with an associated file header so I can paste the resulting blob into Gemini and ChatGPT and have a conversation about the changes I would like to make, before I send off the resulting prompt to Gemini Code Assist.

Here's my script, if anyone is interested, as I find it to be incredibly useful:

    find . -type f \( -name "*.tf" -o -name "*.tfvars" -o -name "*.json" -o -name "*.hcl" -o -name "*.sh" -o -name "*.tpl" -o -name "*.yml" -o -name "*.yaml" -o -name "*.py" -o -name "*.md" \) -exec sh -c 'for f; do echo "### FILE: $f ###"; cat "$f"; echo; done' sh {} +

Tempest1981 20 hours ago

> alphabet just prints the English alphabet in upper and lowercase. I use this surprisingly often

I'm curious to hear some examples (feel like I'm missing out)

andai 19 hours ago

    alias mpa='mpv --no-video'

    mpa [youtube_url]
I use this to listen to music / lectures in the terminal.

I think it needs yt-dlp installed — and reasonably up to date, since YouTube keeps breaking yt-dlp... but the updates keep fixing it :)

  • andai 19 hours ago

    On the subject of yt-dlp, I use it to get (timestamped) transcripts from YouTube, to shove into LLMs for summaries.

        ytsub() {
            yt-dlp \
                --write-sub \
                --write-auto-sub \
                --sub-lang "en.*" \
                --skip-download \
                "$1" && vtt2txt
        }
    
        ytsub [youtube_url]
    
    Where vtt2txt is a python script — slightly too long to paste here — which strips out the subtitle formatting, leaving a (mostly) human readable transcript.
chamomeal 19 hours ago

I started writing way more utility scripts when I found babashka. Magic of clojure, instant startup, easy to shell out to any other command, tons of useful built in stuff, developing with the REPL. It’s just a good time!!

jimmySixDOF 6 hours ago

I had my hopes on this project RawDog using local smol sized LLMs but it hasn't been updated in a while. I feel like all this should be running easily in the background nowadays.

https://github.com/AbanteAI/rawdog

interestica 21 hours ago

Share yours!

I use this as a bookmarklet to grab the front page of the new york times (print edition). (You can also go back to any date up to like 2011)

I think they go out at like 4 am. So, day-of, note that it will fail if you're in that window before publishing.

    javascript:(()=>{let d=new Date(new Date().toLocaleString('en-US',{timeZone:'America/New_York'})),y=d.getFullYear(),m=('0'+(d.getMonth()+1)).slice(-2),g=('0'+d.getDate()).slice(-2);location.href=`https://static01.nyt.com/images/${y}/${m}/${g}/nytfrontpage/scan.pdf`})()
chasil 20 hours ago

I like this one.

  $ cat /usr/local/bin/awkmail
  #!/bin/gawk -f

  BEGIN { smtp="/inet/tcp/0/smtp.yourco.com/25";
  ORS="\r\n"; r=ARGV[1]; s=ARGV[2]; sbj=ARGV[3]; # /bin/awkmail to from subj < in

  print "helo " ENVIRON["HOSTNAME"]        |& smtp;  smtp |& getline j; print j
  print "mail from:" s                     |& smtp;  smtp |& getline j; print j
  if(match(r, ","))
  {
   split(r, z, ",")
   for(y in z) { print "rcpt to:" z[y]     |& smtp;  smtp |& getline j; print j }
  }
  else { print "rcpt to:" r                |& smtp;  smtp |& getline j; print j }
  print "data"                             |& smtp;  smtp |& getline j; print j

  print "From:" s                          |& smtp;  ARGV[2] = ""   # not a file
  print "To:" r                            |& smtp;  ARGV[1] = ""   # not a file
  if(length(sbj)) { print "Subject: " sbj  |& smtp;  ARGV[3] = "" } # not a file
  print ""                                 |& smtp

  while(getline > 0) print                 |& smtp

  print "."                                |& smtp;  smtp |& getline j; print j
  print "quit"                             |& smtp;  smtp |& getline j; print j

  close(smtp) } # /inet/protocol/local-port/remote-host/remote-port
0xbadcafebee 14 hours ago

The scripts from my junk drawer (https://github.com/peterwwillis/junkdrawer) I use every day are 'kd' and 'gw', which use the Unix dialog command to provide an easy terminal UI for Kubectl and Git Worktrees (respectively)... I probably save 15+ minutes a day just flitting around in those UIs. The rest of the scripts I use for random things; tasks in AWS/Git/etc I can never remember, Terraform module refactoring, Bitbucket/GitHub user management, Docker shortcuts, random password generation, mirroring websites with Wget, finding duplicate files, etc.

lillesvin 19 hours ago

Obviously, to each their own, but to me this is an overwhelming number of commands to remember, on top of all the ones they're composed of, which you'll likely need to know anyway regardless of whether the custom ones exist.

Like, I'd have to remember both `prettypath` and `sed`, and given that there's hardly any chance I'll not need `sed` in other situations, I now need to remember two commands instead of one.

On top of that `prettypath` only does s/:/\\n/ on my path, not on other strings, making its use extremely narrow. But generally doing search and replace in a string is incredibly useful, so I'd personally rather just use `sed` directly and become more comfortable with it. (Or `perl`, but the point is the same.)

As I said, that's obviously just my opinion, if loads of custom scripts/commands works for you, all the more power to you!

teo_zero 11 hours ago

Please note that 'each' is fundamentally different from 'xargs'.

  echo 1 2 3 | each "rm {}"
is the same as

  rm 1
  rm 2
  rm 3
while

  echo 1 2 3 | xargs rm
is the same as

  rm 1 2 3
I would rather say that 'each' replaces (certain uses of) 'for':

  for i in 1 2 3; do rm $i; done
  • jgtrosh 11 hours ago

    It's equivalent to xargs -I {} rm {}

           -I replace-str
                  Replace occurrences of replace-str in the initial-arguments
                  with names read from standard input.  Also, unquoted blanks
                  do not terminate input items; instead the separator is the
                  newline character.  Implies -x and -L 1.
sedatk 21 hours ago

> `rn` prints the current time and date using date and cal.

And you can type `rn -rf *` to see all timezones recursively. :)

botverse 8 hours ago

I did something similar with copy until I found this which works across remote terminals too:

`alias clip="base64 | xargs -0 printf '\e]52;c;%s\007'"`

It just sends it to the client’s terminal clipboard.

`cat thing.txt | clip`

apricot13 7 hours ago

and here's me still ctrl+r-ing for my commonly used methods

  • dbalatero an hour ago

    hopefully with fzf and not with the built in ctrl r

sitebolts 7 hours ago

Here are some snippets that we've compiled over time:

https://snhps.com

They're not all necessarily the most efficient/proper way to accomplish a task, but they're nice to have on hand and be able to quickly share.

Admittedly, their usefulness has been diminished a bit since the rise of LLMs, but they still come in handy from time to time.

hiq 18 hours ago

I've started using snippets for code reviews, where I find myself making the same comments (for different colleagues) regularly. I have a keyboard shortcut opening a fuzzy search to find the entry in a single text file. That saves a lot of time.

As an aside, I find most of these commands very long. I tend to use very short aliases, ideally 2 characters. I'm assuming the author uses tab most of the time, if the prefixes don't overlap beyond 3 characters it's not that bad, and maybe the history is more readable.

liqilin1567 9 hours ago

One of my biggest headaches is stripping a specific number of bytes from the head or tail of a binary file. I couldn't find any built-in tool for that, so I wrote one in C++.

  • rkeene2 9 hours ago

    Last X bytes: dd bs=1 skip=X

    First X bytes: dd bs=X count=1

    • liqilin1567 8 hours ago

      Thanks, there were a few errors after testing.

      1. stripping the first X bytes: dd bs=1 skip=X

      2. stripping the last X bytes: truncate -s -X
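
      For completeness, GNU coreutils alone can handle both (assuming GNU tail and head):

        X=16
        # strip the first X bytes (output starts at byte X+1):
        tail -c +$((X + 1)) in.bin > out.bin
        # strip the last X bytes (GNU head accepts a negative byte count):
        head -c "-$X" in.bin > out.bin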

internet_points 9 hours ago

`line 10` can be written as `sed -n 10p` (instead of head+tail)

panki27 7 hours ago

My most used function is probably the one I use to find the most recent files:

    lt () { ls --color=always -lt ${1} | head; }
some_guy_nobel 20 hours ago

These are great, and I have a few matching myself.

Here are some super simple ones I didn't see that I use almost every day:

cl="clear"

g="git"

h="history"

ll="ls -al"

path='echo -e ${PATH//:/\\n}'

lv="live-server"

And for common navigation:

dl="cd ~/Downloads"

dt="cd ~/Desktop"

  • hackeraccount 11 minutes ago

    I'm terrible about remembering shortcuts (edit a bash line in an editor? I can never remember it), but clear (CTRL-l) is one that really stuck.

    That and exit (CTRL-d). A guy I used to work with just mentioned it casually and somehow it just seared itself into my brain.

amterp 19 hours ago

Love this, lots of great ideas I'll be stealing :)

Folks interested in scripting like this might like this tool I'm working on https://github.com/amterp/rad

Rad is built specifically for writing CLI scripts and is perfect for these sorts of small to medium scripts, takes a declarative approach to script arguments, and has first-class shell command integration. I basically don't write scripts in anything else anymore.

haskellshill 18 hours ago

> `nato bar` returns Bravo Alfa Romeo. I use this most often when talking to customer service and need to read out a long alphanumeric string, which has only happened a couple of times in my whole life. But it’s sometimes useful!

Even more useful is just learning the ICAO Spelling Alphabet (aka NATO Phonetic Alphabet, of which it is neither). It takes like an afternoon and is useful in many situations, even if the receiver does not know it.

  • shellfishgene 8 hours ago

    Some time ago I tried to tell my email address to someone in Japan over the phone who did not speak English very well. It turned out to be basically impossible. I realized later one could probably come up with a phonetic alphabet of English words most Japanese know!

sorenjan 19 hours ago

I've got a ccurl python script that extracts the cookies from my Firefox profile and then passes those on to curl; that way I can get webpages where I'm logged in.
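
The idea, roughly, as a hypothetical shell sketch (assumes sqlite3, a single default Firefox profile, and that Firefox isn't holding a lock on the database):

    #!/usr/bin/env bash
    # ccurl-ish sketch: pull cookies for a domain out of Firefox's cookies.sqlite
    # and hand them to curl. usage: ccurl example.com https://example.com/private
    domain=$1; shift
    db=$(ls ~/.mozilla/firefox/*.default*/cookies.sqlite 2>/dev/null | head -n 1)
    cookies=$(sqlite3 "$db" \
      "SELECT group_concat(name || '=' || value, '; ')
       FROM moz_cookies WHERE host LIKE '%$domain';")
    curl -b "$cookies" "$@"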

rcarmo 10 hours ago

As a fun game, I suggest feeding the entire piece to an LLM and asking it to create those scripts. The differences between Claude, GPT-5 and Gemini are very interesting.

helicaltwine a day ago

As a bonus, I prepend my custom aliases or scripts with my user name and hyphen (i.e helicaltwine-). It helps me recall rarely used scripts when I need them and forget the names.

  • dunb 20 hours ago

    I follow a similar but more terse pattern. I prepend them all with a comma, and I have yet to find any collisions. If you're using bash (and I assume posix sh as well), the comma character has no special meaning, so this is quite a nice use for it. I agree that it's nice to type ",<tab>" and see all my custom scripts appear.

javier123454321 21 hours ago

This is one area where I've found success in vibe coding: making scripts for repetitive tasks that are just above the complexity threshold where the math between automating and doing it manually is not so clear. I have Copilot generate the code for me, and honestly I don't care too much about its quality or extensibility; the scripts are easy enough to read through that I don't feel like my job is being an AI PR reviewer.

vunderba a day ago

Nice. I have a bash script similar to the one listed "removeexif" called prep_for_web which takes any image file (PNG, BMP, JPG, WebP), scrubs EXIF data, checks for transparency and then compresses it to either JPG using MozJPEG or to PNG using PNGQuant.

[1] https://github.com/mozilla/mozjpeg

[2] https://pngquant.org
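
Not the actual script, but the same idea sketched roughly (assumes exiftool, ImageMagick, pngquant and mozjpeg's cjpeg are installed):

    #!/usr/bin/env bash
    in=$1
    exiftool -all= -overwrite_original "$in"                 # scrub metadata
    if identify -format '%A' "$in" | grep -qiE 'true|blend'; then
        # image has an alpha channel: quantize to a smaller PNG
        pngquant --force --output "${in%.*}.web.png" "$in"
    else
        # opaque image: recompress as JPEG with mozjpeg
        convert "$in" ppm:- | cjpeg -quality 80 -outfile "${in%.*}.web.jpg"
    fi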

xiphias2 21 hours ago

An important advantage of aliases was not mentioned: I see everything in one place and can easily build aliases on top of other aliases without much thinking.

Anyways, my favourite alias that I use all the time is this:

    alias a='nvim ~/.zshrc && . ~/.zshrc'
It solves the "not loaded automatically" part, at least for the current terminal.
pmontra a day ago

I also have a radio script to play internet streams with mpv (?). Other random stuff

A password or token generator, simple or complicated random text.

Scripts to list, view and delete mail messages inside POP3 servers

n, to start Nautilus from terminal in the current directory.

lastpdf, to open the last file I printed as PDF.

lastdownload, to view the names of the n most recent files in the Downloads directory.

And many more but those are the ones that I use often and I remember without looking at ~/bin

PUSH_AX 10 hours ago

Mkdir then cd into it, I just use ‘take’? Maybe this isn’t available by default everywhere?

encom 2 hours ago

Interesting, but none of the links are working... codeberg.org isn't responding, it just spins forever.

zeckalpha 15 hours ago

A couple more standard approaches with fewer chars:

jsonformat -> jq

running -> pgrep

senderista 14 hours ago

fish abbreviations >> bash aliases

ufko_org 12 hours ago

Thank you, I also stopped using aliases and have everything as scripts in my ~/bin

progforlyfe 3 hours ago

absolutely love these time savers!!

sid- 20 hours ago

Why we don't have mkcd in Linux natively boggles my mind :)

  • marcuskaz 14 hours ago

    Likewise, why doesn't git clone automatically cd into the repo?

    • andriamanitra 3 hours ago

      A subprocess (git) can't modify the working directory of the parent process (the shell). This is a common annoyance with file managers like yazi and ranger as well—you need an extra (usually manual!) installation step to add a shell integration for whichever shell you're using so the shell itself can change directory.

      The best solution for automatically cd'ing into the repo is to wrap git clone in a shell function or alias. Unfortunately I don't think there's any way to make git clone print the path a repository was cloned to, so I had to do some hacky string processing that tries to handle the most common usage (ignore the "gh:" in the URL regex, my git config just expands it to "git@github.com:"):

      https://github.com/Andriamanitra/dotfiles/blob/d1aecb8c37f09...
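
      A simplified version of such a wrapper might look like this (hypothetical name; it only guesses the directory for the common `git clone <url>` case):

        gcl() {
          git clone "$@" || return
          local last=${!#}          # last argument, usually the URL
          local dir=${last##*/}     # drop everything up to the final slash
          cd "${dir%.git}"
        }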

  • taejavu 19 hours ago

    zsh has a `take` utility that is exactly this

CrimpCity 20 hours ago

Lately I’ve been using caffeinate to run long running scripts without interruption from sleep on Mac. Nothing crazy but could be useful to newer devs.

hshdhdhehd 12 hours ago

These arent bad but much better if they were all flags to the cat command.

E.g. cat --copy

headgasket a day ago

If you use x0vnc (useful if you use a Linux machine both from the attached screen and over VNC, and in a bunch of other scenarios), copy and paste to and from the VNC client is not implemented, which is quite frustrating. Here are 2 scripts that do that for you; I now use this all day. https://github.com/francoisp/clipshare

czertyaka 13 hours ago

Some of these, especially the text processing ones, are already built into Nushell.

wiether a day ago

It's been a while since I've read something this useful!

There's also some very niche stuff that I won't use but found funny.

  • giraffe_lady a day ago

    The nato phonetic alphabet one cracked me up. My dude you don't need that, call center employees don't know it, just say S as in Sugar like ur grandma used to.

    • WCSTombs 20 hours ago

      The NATO alphabet is made of commonly known words that are hard to misspell and hard to mishear (at least the initial phonemes). The person on the other end doesn't need to be able to recite the words, they just need to be able to hear "november" and recognize that it starts with N.

      • giraffe_lady 16 hours ago

        Yes thank you that's a good description of what a phonetic alphabet is and how it's used.

    • ericyd a day ago

      The NATO phonetic alphabet is still useful even if the other party doesn't know it; I've used it a bunch of times on the phone to spell out my 10-letter last name. It saves quite a lot of time and energy for me vs saying "letter as in word" for each letter.

      • vunderba 21 hours ago

        Exactly. The listening party doesn't need to have knowledge of the NATO alphabet to still benefit from it since they are just regular English words.

        I once had someone sound out a serial number over a spotty phone connection years ago and they said "N as in NAIL". You know what sounds a lot like NAIL? MAIL.

        And that is why we don't just arbitrarily make up phonetic alphabets.

      • SoftTalker a day ago

        > saying "letter as in word" for each letter

        Which often just confuses things further.

        Me: My name is "Farb" F-A-R-B. B as in Baker.

        Them: Farb-Baker, got it.

      • giraffe_lady a day ago

        Right but it's not much more useful than any other phonetic alphabet the other party doesn't know, including the one you make up on the spot.

        • sfink 20 hours ago

          If you're me, it's still useful because the ones I make up on the spot aren't great.

          "S-T-E-V-E @ gmail.com, S as in sun, T as in taste, ..." "Got it, fpeve."

        • dragonwriter 20 hours ago

          I dunno, there's a pretty good chance that the one that people spent time and effort designing to replace earlier efforts with the goal of reducing potential ambiguity and for use over noisy connections with expectation that mistakes could cost lives is probably better than what you improvise on the spot

    • kelvinjps10 21 hours ago

      When I worked in customer service, I asked a teammate what I could do to spell back something the customer said, and she taught me that system, it helped me a lot.

    • senkora a day ago

      I once had the customer service agent for Iberia (the Spanish airline) confirm my confirmation number with me using it.

      It worked with me and I guess it must have usually worked for him in most of his customer interactions.

    • dcassett 16 hours ago

      I've found the NATO alphabet fairly common at call centers, with globalization being a factor.

fragmede a day ago

Where are the one-letter aliases? My life got better after I aliased k=kubectl

thibran 20 hours ago

30% of the productivity hacks can be achieved in vanilla Nushell.

yegle a day ago

The markdownquote can be replaced by (at least in vim):

^ (jump to the beginning)

ctrl+v (block selection)

j (move cursor down)

shift+i (bulk insert?)

type ><space>

ESC

blackhaj7 17 hours ago

Love this. Gunna use plenty of these

merksoftworks 11 hours ago

in oh-my-zsh you can use `take` to do what mkcd does.

desireco42 16 hours ago

I had youtube and serveit and some others, but pasta is really good, thanks!

  • janpmz 10 hours ago

    Last month I saw a tweet about how to serve files using

    python3 -m http.server 1337

    Then I turned it into an alias, called it "serveit" and tweeted about it. And now I see it as a bash script, made a little bit more robust in case python is not installed :)

kwar13 11 hours ago

that was beautiful to read. command line ftw!

ttflee 13 hours ago

`perldoc perlrun`

banku_brougham 19 hours ago

this is really great. at some point i gave up on being more efficient on the terminal, but many pain points are solved by your work

munchlax 21 hours ago

mksh is already the MirBSD Korn SHell

  • rauli_ 4 hours ago

    Which very very few people have actually installed on their system.

exasperaited 18 hours ago

The "scripts" I use the most that I am most happy with are a set of Vagrant tools that manage initialising different kinds of application environments with an apt cache on the host. Also .ssh/config includes to make it as easy as possible to work with them from VSCode.

I set this stuff up so long ago I sort of forgot that I did it at all; it's like a standard feature. I have to remember I did it.

naikrovek 19 hours ago

> wifi toggle

this fella doesn't know what "toggle" means. in this context, it means "turn off if it's currently on, or turn on if it's currently off."

this should be named `wifi cycle` instead. "cycle" is a good word for turning something off then on again.

naming things is hard, but it's not so hard that you can't use the right word. :)

  • codesnik 15 hours ago

    or wifi toggle-toggle!

SuperHeavy256 20 hours ago

I hope to see an operating system with these scripts as built-in, because they are so intuitive and helpful! Which OS will be the first to take this on?

samtrack2019 20 hours ago

no offense, but a lot of those scripts are pretty hacky. They may work for the user, but I would not use them without making sure to review them and adapt them to my workflow.

  • ziotom78 9 hours ago

    That's a fair point. I think the author intended the post to be a treasure trove of ideas for your own scripts, not as something to blindly include in your daily workflow.