The xpipe command reads input from stdin and splits it after the given number of bytes or lines, or whenever a line matches the given pattern. It then invokes the given utility repeatedly, feeding it the generated data chunks as input.
You can think of it as a Unix love-child of the split(1), tee(1), and xargs(1) commands.
Its usefulness might best be illustrated by an example. Suppose you have a file 'certs.pem' containing a number of X.509 certificates in PEM format, and you wish to extract, e.g., the subject and validity dates from each.
The openssl x509(1) utility can only accept a single certificate at a time, so you'll have to first split the input into individual files containing exactly one cert, then repeatedly run the x509(1) command against each file.
And, let's be honest, you probably have to google how to use sed(1) or awk(1) to extract subsequent blocks from a flip-flop pattern.
xpipe(1) can do the job for you in a single command:
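Here is a sketch of what that single command could look like; the -p flag for the split pattern is an assumption on my part, so check xpipe(1) for the exact option names:

$ xpipe -p '^-----END CERTIFICATE-----$' openssl x509 -noout -subject -dates < certs.pem

This splits certs.pem at each PEM trailer line and hands every chunk to openssl x509(1), which prints the subject and validity dates of that one certificate.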
A resource monitor that shows usage and stats for the processor, memory, disks, network, and processes.
Some commenters requested that we use our restored vintage 1930 Model 15 Teletype as a terminal for Linux. Hooking up a 5-bit Baudot mechanical contraption to a modern OS, even one that is terminal-friendly, is not without some challenges: adapting to the non-standard high-voltage 60 mA current loop, interfacing ASCII to the much smaller and different Baudot encoding, working in all caps, dealing with Baudot FIGS and LTRS modes, and making sure the computer doesn't overrun the pokey 45.5 baud connection. But hey, Unix was developed on (much more modern 8-bit) teletypes, so that should still work, shouldn't it?
The first of McIlroy's dicta is often paraphrased as "do one thing and do it well", which is shortened from "Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new 'features.'"
McIlroy's example of this dictum is:
Surprising to outsiders is the fact that UNIX compilers produce no listings: printing can be done better and more flexibly by a separate program.
If you open up the man page for ls on macOS, you'll see that it starts with
ls [-ABCFGHLOPRSTUW@abcdefghiklmnopqrstuwx1] [file ...]
That is, the one-letter flags to ls include every lowercase letter except for {jvyz}, 14 uppercase letters, plus @ and 1. That’s 22 + 14 + 2 = 38 single-character options alone.
One of the things that makes the shell an invaluable tool is the number of available text-processing commands, and the ability to easily pipe them into each other to build complex text-processing workflows. These commands can make it trivial to perform text and data analysis, convert data between different formats, filter lines, etc.
When working with text data, the philosophy is to break any complex problem you have into a set of smaller ones, and to solve each of them with a specialized tool.
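As a small illustration of that approach (input.txt is just a placeholder file name here), the classic word-frequency pipeline chains several single-purpose tools:

$ tr -cs 'A-Za-z' '\n' < input.txt | tr 'A-Z' 'a-z' | sort | uniq -c | sort -rn | head -10

The first tr breaks the text into one word per line, the second lowercases it, sort groups identical words, uniq -c counts each group, and sort -rn | head -10 lists the ten most frequent words.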
Once in a while a new program really surprises me. Reminiscing a while ago, I came up with a list of eye-opening Unix gems. Only a couple of these programs are indispensable or much used. What singles them out is their originality. I cannot imagine myself inventing any of them.
Meld is a visual diff tool that makes it easier to compare and merge changes in files, directories, Git repos, and more.
A tool for exploring a Docker image and its layer contents, and for discovering ways to shrink the size of your Docker/OCI image.