The "Dependency Cutout" Workflow Pattern

I use Git submodules for dependencies. My approach to a situation like this:

    1: Clone gh.com/someone/LibBar to gh.com/me/LibBar
    2: Fix the bug
    3: Send pull request to someone
    4: git submodule set-url lib/LibBar https://gh.com/me/LibBar.git
       git submodule sync lib/LibBar
       git submodule update --init --remote --recursive lib/LibBar
       cd lib/LibBar/; git checkout main; cd ../..
       git add .; git commit -m "Use my own version of lib/LibBar"

I keep using my fork until upstream accepts my pull request, then I switch the URL of the dependency back.
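
The switch-back is symmetric; a minimal sketch, assuming the same paths and the upstream URL from step 1:

    # once the pull request is merged upstream, point the submodule back
    git submodule set-url lib/LibBar https://gh.com/someone/LibBar.git
    git submodule sync lib/LibBar
    git submodule update --init --remote --recursive lib/LibBar
    git add .; git commit -m "Switch lib/LibBar back to upstream"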

For me it usually looks like:

    - Fork LibBar on GitHub, apply the patch
    - In Docker stage build-bar:
        - RUN git clone gh.com/my_org/bar
        - WORKDIR bar/build
        - RUN cmake .. -DCMAKE_BUILD_TYPE=Release (old habits die hard, need to remember to use -B)
        - RUN make -j
    - In the main Docker stage: RUN --mount=from=build-bar,src=bar/build,target=/tmp/bar make -C /tmp/bar install (full Dockerfile sketch below)
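
Put together, a minimal Dockerfile sketch of the above; debian:bookworm is a placeholder base image, and it assumes git, cmake, and make are already present (in practice an apt-get install line would precede the clone):

    FROM debian:bookworm AS build-bar
    # fetch the patched fork and build it out of tree
    RUN git clone https://gh.com/my_org/bar
    WORKDIR bar/build
    RUN cmake .. -DCMAKE_BUILD_TYPE=Release
    RUN make -j

    FROM debian:bookworm
    # bind-mount the finished build tree from build-bar and install from it;
    # rw lets `make install` write its install manifest into the mounted
    # tree (the writes are discarded, only the installed files persist)
    RUN --mount=from=build-bar,src=bar/build,target=/tmp/bar,rw \
        make -C /tmp/bar install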

...and that's it. A little bit of wastage if I have to install over the top of some apt install, but oh well. If I contribute back to upstream, or they fix it independently, gravy: remove the extra build. This doesn't work as well for Python or JS, of course, and doesn't "solve" pulling in upstream regularly, but I usually don't need to. Benefits of not usually working on internet-facing software, I suppose.

The Python case is thankfully usually not terrible nowadays (clone, `uv build`, push to an artifact registry), modulo needing to pass the right flag to uv so it takes both my private registry and public PyPI into account at the same time.
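
A minimal sketch of that flow; registry.example.com, the repo URL, and the package name are placeholders:

    git clone https://gh.com/someone/LibBar && cd LibBar
    # apply the fix, then build sdist + wheel and push to the private registry
    uv build
    uv publish --publish-url https://registry.example.com/ dist/*
    # consumers resolve from both indexes; uv gives extra indexes priority
    # over the default index (PyPI), so the patched build is picked up
    uv pip install --extra-index-url https://registry.example.com/simple/ libbar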