Comment by vidarh
3 days ago
I'd replace the first part with the following. It isn't any shorter, but in general, if I want a list of files for a pipeline, find is usually more flexible than ls for anything but the most trivial cases:
find -maxdepth 1 -type f -printf '%s %f\n' | sort -n | head -n 5
For the latter part, I'd tend to think that if you're going to use awk and jq, you might as well use Ruby.
ruby -rjson -nae 'puts(JSON.pretty_generate({n: $F[1], s: "%.5f MB" % ($F[0].to_i / 1e6)}))'
("-nae" effectively takes an expression on the command line (-e), wraps it in "while gets; ... end" (-a), and adds the equivalent to "$F = $_.split" before the first line of your expression (-n))
It's still ugly, though, so it's no competition for nushell.
I'd be inclined to drop a little wrapper in my bin with a few lines of helpers (see my other comment) and do it all in Ruby if I wanted to get closer without having to change shells...
Ruby is a pretty natural fit for shell scripting.
https://lucasoshiro.github.io/posts-en/2024-06-17-ruby-shell...
It's close, but a few things could be better to make it easier to access e.g. file size and type. I think a 50-100 line set of helpers and a one-line wrapper (to spawn Ruby with -r<helper> -e <command line>) would get you most of the way to where nushell is.
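To make that concrete, here's a rough sketch of what such a helper file and wrapper might look like (the names rbsh_helpers.rb, rbsh, files and mb are made up for illustration, not anything that exists):

  # ~/bin/rbsh_helpers.rb -- hypothetical helpers preloaded into every one-liner
  require 'json'
  require 'pathname'

  Entry = Struct.new(:name, :size, :type)

  # Entries in a directory as name / size in bytes / ftype ("file", "directory", ...)
  def files(dir = '.')
    Pathname.new(dir).children.map { |p| Entry.new(p.basename.to_s, p.size, p.ftype) }
  end

  # Format a byte count as megabytes
  def mb(bytes)
    '%.5f MB' % (bytes / 1e6)
  end

plus a wrapper along these lines:

  #!/bin/sh
  # ~/bin/rbsh -- hypothetical wrapper: run a Ruby expression with the helpers loaded
  exec ruby -r"$HOME/bin/rbsh_helpers.rb" -e "$1"

With that, the pipeline above could become something like:

  rbsh 'files.select { |f| f.type == "file" }.sort_by(&:size).first(5).each { |f| puts JSON.pretty_generate({n: f.name, s: mb(f.size)}) }'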