Comment by Storment33

16 hours ago

> Huh? Who cares if the script is .sh, .bash, Makefile, Justfile, .py, .js or even .php?

Me. Typically I have found it to be a sign of over-engineering, with no benefit over just using a shell script/task runner; all of it should be plumbing simple enough that a task runner can handle it.

> If it works it works, as long as you can run it locally, it'll be good enough,

Maybe when it is your own personal project "if it works it works" is fine. But when you come to a corporate environment, there start to be issues of readability, maintainability, proprietary tooling, additional dependencies, etc., which I have found happen when people over-engineer and use programming languages (like Python).

E.g.

> never_inline 30 minutes ago:

> Build a CLI in python or whatever which does the same thing as CI, every CI stage should just call its subcommands.

However,

> and sometimes it's an even better idea to keep it in the same language the rest of the project is

I'll agree. Depending on the project's language, etc., other options might make sense. But personally, so far, every time I have come across something not using a task runner it has just been the wrong decision.

> But personally, so far, every time I have come across something not using a task runner it has just been the wrong decision.

Yeah, tends to happen a lot when you hold strong opinions with strong conviction :) Not that it's wrong or anything, but it's highly subjective in the end.

Typically I see larger issues being created from "under-engineering" and just rushing with the first idea people can think of when they implement things, rather than "over-engineering" causing similarly sized future issues. But then I also know everyone's history is vastly different; my views are surely shaped more by the specific issues I've witnessed (and sometimes contributed to :| ) than by anything else.

  • > Yeah, tends to happen a lot when you hold strong opinions with strong conviction :) Not that it's wrong or anything, but it's highly subjective in the end.

    Strong opinions, loosely held :)

    > Typically I see larger issues being created from "under-engineering" and just rushing with the first idea people can think of when they implement things, rather than "over-engineering"

    Funnily enough, I think running with the first idea is creating a lot of the "over-engineering" I am seeing: not stopping to consider other, simpler solutions, or even whether the problem needs solving or is worth solving in the first place.

    I quickly asked Claude to convert one of my open-source repos from Make/Nix/Shell to Python/Nix to see how it would look. It is actually one of the better uses of Python as a task runner I have seen.

    * https://github.com/DeveloperC286/clean_git_history/pull/431

    While the Python version is not as bad as others I have seen previously, I am still struggling to see why you'd want it over Make/Shell.

    It introduces more dependencies (Python itself, which I solved via Nix, though others haven't solved that problem), and the Python script has its own dependencies (such as Click for the CLI).

    It is less maintainable as it is more code, roughly 3x the amount of the Makefile.

    To me the Python code is more verbose and not as simple as the Makefile's targets, so it is less readable as well.
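
    For flavor, here is a hypothetical sketch of the shape such a task runner takes (this is not the code from the linked PR, and the stage command is a placeholder):

      # tasks.py - each CI stage calls one subcommand, e.g. `python tasks.py lint`
      import subprocess

      import click  # the CLI dependency mentioned above

      @click.group()
      def cli():
          """Task runner entry point."""

      @cli.command()
      def lint():
          """Roughly what a one-line Makefile target becomes."""
          subprocess.run(["cargo", "clippy"], check=True)  # placeholder command

      if __name__ == "__main__":
          cli()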

    • > It introduces more dependencies (Python itself, which I solved via Nix, though others haven't solved that problem), and the Python script has its own dependencies (such as Click for the CLI).

      uv scripts are great for this type of workflow.

      There are even scripts which will install uv from within the same file, effectively making it equivalent to just ./run-file.py; it handles all the dependency management, the Python version management, everything included, and it works everywhere.

      https://paulw.tokyo/standalone-python-script-with-uv/
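
      For reference, a minimal sketch of the inline-metadata form (PEP 723) that uv understands; the dependency here is illustrative, and it assumes uv is already on PATH:

        #!/usr/bin/env -S uv run --script
        # /// script
        # requires-python = ">=3.12"
        # dependencies = ["click"]
        # ///
        import click

        @click.command()
        def main():
            # uv resolves and caches the dependencies on first run
            click.echo("hello from a self-contained script")

        if __name__ == "__main__":
            main()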

      Personally I end up just downloading uv, and so I don't use the uv-download trick from that post, but if I were using something like GitHub Actions runners, which are more ephemeral, I'd just do this.

      Something like this can start out simple and can scale well past the limitations of bash, which can be abundant at times.

      That being said, I still write some shell scripts, because executing other applications is first-class in bash but not so much in Python. After discovering this, though, I might create some new scripts in Python with automated uv, because I end up installing uv on many devices anyway (uv's really good for Python).

      I am interested in Bun Shell as well, but that feels far too bloated, and it isn't used by many, so there's less AI assistance at times. I also haven't really understood Bun Shell yet, so bash is usually superior to it for me.

Using shell becomes deeply miserable as soon as you encounter its kryptonite, the space character. Especially but not limited to filenames.

I find that shell scripting has a sharp cliff. I agree with the sentiment that most things are over-engineered. However it’s really easy to go from a simple shell script running a few commands to something significantly more complex just to do something seemingly simple, like parse a semantic version, or make an API call and check the status code, etc.

The other problem with shell scripting on things like GHA is that it’s really easy to introduce security vulnerabilities by, e.g., forgetting to quote your variables and letting an uncontrolled input through.
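
To make the quoting hazard concrete, a minimal sketch (the directory name is illustrative):

  import subprocess

  target = "build output"  # a path with a space in it
  # In shell, an unquoted `rm -rf $target` word-splits into two arguments,
  # "build" and "output". With list-style arguments, each element is passed
  # as exactly one argument; nothing is ever re-parsed by a shell.
  subprocess.run(["rm", "-rf", target], check=True)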

There’s no middle ground between bash and Python, and a lot of functionality lives in that space.

  • > However it’s really easy to go from a simple shell script running a few commands to something significantly more complex just to do something seemingly simple, like parse a semantic version, or make an API call and check the status code, etc.

    Maybe I keep making the wrong assumption that everyone is using the same tools the same way, and that's why my opinions seem very strong. But I wouldn't even think of trying to "parse a semantic version" in shell; I am treating the shell scripts and task runners as plumbing, and I would be handing that off to a dedicated tool to action.

    • Let’s say I have a folder of tarballs and need to install the latest version. I could reach for an additional “dedicated” tool, get it installed into the CI environment and then incorporate it into the build process, or I could just make a slight modification to my existing shell script, do something like "ls mypkg-*.tar.gz | xargs -n1 | sort -Vr | head -n1", and then move on. But then we start publishing multiple release candidates and now I need to add to that logic to further distinguish between earlier and later rc versions. And so on and so forth…

      Now I’m in agreement with you that this is a bad fit for shell scripting, but it is often pragmatic and expedient. And because there is a cliff between bash and (say) Python, some of the time you’re going to choose the path of least resistance.
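
      (For comparison, a sketch of what the "dedicated tool" path might look like in Python, assuming the packaging library and the mypkg-*.tar.gz naming from above; pre-releases like 1.2.0rc1 order correctly below 1.2.0 here, which sort -V does not guarantee.)

        from pathlib import Path

        from packaging.version import Version  # pip install packaging

        def latest_tarball(folder: str = ".") -> Path:
            # Strip the "mypkg-" prefix and ".tar.gz" suffix, compare as PEP 440 versions.
            def version_of(p: Path) -> Version:
                return Version(p.name.removeprefix("mypkg-").removesuffix(".tar.gz"))
            return max(Path(folder).glob("mypkg-*.tar.gz"), key=version_of)

        print(latest_tarball())  # e.g. mypkg-1.2.0.tar.gz, not mypkg-1.2.0rc1.tar.gz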

      Now scale this out to a small team of engineers all facing the same dumb decision of needing to make some tradeoff when they would much rather be writing application logic. The lack of a ubiquitous and robust intermediate language leads to brittle CI fraught with security vulnerabilities.

      While the example I provided is a bit contrived, this behavior isn’t hypothetical. I see it everywhere I’ve worked.

Yeah, imagine having to maintain a Python dependency (which undergoes security constraints) all because some junior can't read/write bash... and then that junior telling you you're the problem lmao