
Comment by embedding-shape

8 hours ago

Maybe I'm too old school, but both GitHub and Codeberg for me are asynchronous "I want to send/share the code somehow", not "my active workspace I require to do work". But reading

> the worst thing ever for me as a developer is having the urge to code and not being able to access my remote.

Makes it seem like GitHub/Codeberg has to be online for you to be able to code, is that really the case? If so, how does that happen, you only edit code directly in the GitHub web UI or how does one end up in that situation?

For me it's a soft block rather than a hard block. I use multiple computers, so when I switch to the other one I usually do a git pull, and after every commit I do a push. If that gets interrupted, then I have to resort to things like rsyncing over from the other system, but more than once I've lost work that way. I'm strongly considering just standing up a VM and using "just git", forgoing any UI, but I make use of other features like CI/CD and Releases for distribution, so the VM strategy is still just a bandaid. When the remote is unavailable, it can be very disruptive.

  • > If that gets interrupted, then I have resort to things like rsyncing over from the other system

    I'm guessing you have SSH access between the two? You could just add it as another remote, via SSH, so you can push/pull directly between the two. This is what I do on my home network to sync configs and other things between various machines and OSes, just do `git remote add other-host git+ssh://user@10.55/~/the-repo-path` or whatever, and you can use it as any remote :)

    Bonus tip: you can use local paths as git remote URLs too!
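    To make the tip concrete, here's a minimal sketch of the local-path variant. All the paths under /tmp (and the remote name `mirror`) are invented for the demo; substitute your own:

```shell
set -e
# Hypothetical demo layout: a bare repo on local disk stands in for a remote.
rm -rf /tmp/gr-demo && mkdir -p /tmp/gr-demo && cd /tmp/gr-demo
git init -q --bare mirror.git          # this acts as the "remote"
git init -q work && cd work
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "first commit"
# A plain filesystem path is a valid remote URL:
git remote add mirror /tmp/gr-demo/mirror.git
git push -q mirror HEAD:main
git ls-remote mirror                   # the pushed ref now exists in mirror.git
```

    Over SSH it's the same idea, just with an `ssh://user@host/path/to/repo` URL instead of the path.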

    > but more than once I've lost work that way.

    Huh, how? If you didn't push it earlier, you could just push it later? Same goes for pull. I don't understand how you could lose anything tracked in git; was it corruption, or what happened?

    • Usually one of two things, mostly the latter: I forget to exclude the .git/ directory from the sync, or I have in-progress, nowhere-near-ready-for-commit changes on both hosts, and I forget and sync before I check. These are all PEBKAC problems and/or workflow problems, but on a typical day I'll be working in or around a half-dozen repos and it's too easy to forget. The normal git workflow protects from that because uncommitted changes in one can just be rebased easily the next time I'm working in that on any given computer. I've been doing it like this for nearly 20 years and it's never been an issue because remotes were always quite stable/reliable. I really just need to change my workflow for the new reality, but old habits die hard.

  • > just standing up a VM and using "just git"

    That's what I do. Control your entire world yourself.

  • If you can rsync from the other system, and likely have an SSH connection between them, why don't you just add it as an additional remote and git pull from it directly?

For some projects, the issue tracker is a pretty integral part of the documentation. Sure, you can host your own issue tracker somewhere, but that's still shifting a center point somewhere, in a theoretically decentralized system. I've frequently wished the issue tracker was part of the repository. Also -- love them or hate them -- LLMs would probably love that too.

My main exposure to Codeberg is Zig and it has an issue tracker there and I pull in changes from it.

For how infrequently I interact with Codeberg, I have to say that my experience has been pretty terrible when it comes to availability.

So I guess the answer is: the availability is bad enough that even infrequent interactions with it are a problem.

> Makes it seem like GitHub/Codeberg has to be online for you to be able to code, is that really the case?

I can understand that for work with other active contributors, but I agree with you that it is a daft state of affairs for a solo or mostly-solo project.

Though if you have your repo online even away from the big places, it will get hit by the scrapers and you will end up with admin work to do because of that, even if it doesn't block your normal workflow because your main remote is not public.

You’re right, this is the proper way to use git. And I encourage developers to use their own cloud storage (or a remote volume) for their primary remote.

Even with the best habits, there will be a few times a month where you forgot to push everything up and you’re blocked from work.

Codeberg needs to meet the highest availability levels for it to be viable.

I was shaking my head in disbelief when reading that part too. I mean, git's whole raison d'etre, back when it was introduced, was that you do not need online access to the repo server most of the time.

  • It's getting even worse if you read the thread about Claude going down the other day. People were having mini panic attacks.

  • > I mean, git's whole raison d'etre, back when it was introduced, was that you do not need online access to the repo server most of the time.

    So what? That's not how most people prefer to use it.

  • > git's whole raison d'etre […] was that you do not need online access to the repo server most of the time

    Not really. The point of git was to make Linus' job of collating, reviewing, and merging work from a disparate team of teams much less arduous. It just happens that many of the patterns needed for that also make temporarily disconnected remote repositories work well.

    • The whole point of git was to be a replacement for BitKeeper, after the Linux developers got banned from it for "hacking" after Andrew Tridgell connected to the server over telnet and typed "HELP".
