GitHub’s Large File Storage is no panacea for Open Source — quite the opposite
Stéphane Peter

Thanks for this excellent write-up, Stéphane!

With the caveat that I haven’t tried it, it looks like you can successfully push to a fork if you run git lfs fetch with --all, which triggers downloading of the GitHub LFS objects into .git/lfs/objects.

$ git lfs fetch --all <remote>
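With the same caveat that I haven’t tried it, I’d expect the full fork workflow to look roughly like this (the remote, owner and branch names are just placeholders):

$ git remote add upstream https://github.com/<original-owner>/<repo>.git
$ git lfs fetch --all upstream   # downloads every LFS object into .git/lfs/objects
$ git push origin <branch>       # the LFS pre-push hook can now find and upload the objects it needs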

As you say though, this is not how it’s advertised on GitHub and is a gotcha.

Additionally, you can’t push new objects to GitHub repositories, so if someone wants to update a binary in a fork and push that as part of their change, they can’t do that.[1]

I agree GitHub is not clear on how this changes the workflow, and that, while the pricing is cheap at $5 for 50 GB, the pricing model is likely to put people off using it on public repos.

I’m considering GitHub LFS for my projects that use video and audio files in their tests, but I’m concerned about it introducing complexity and about long-term support (i.e. will GitHub keep supporting this feature down the line?).

At the moment I’m leaning towards hosting the binary files on something like AWS S3, with hooks to download/upload the binaries when a repo is checked out (reading credentials from environment variables). It’s only a handful of lines of code, and it won’t break on me in the future.
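As a rough sketch of what I mean (untested, and the bucket name and fixtures path are made up), a post-checkout hook could be as small as:

#!/bin/sh
# .git/hooks/post-checkout: pull the binary test fixtures from S3 after a checkout.
# Credentials are read from AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY in the environment.
set -e
aws s3 sync "s3://my-project-test-fixtures/" "test/fixtures/"

A matching aws s3 sync in the other direction (run from a pre-push hook or a Makefile target) would handle uploads.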

[1] Apparently with GitHub Enterprise you can allow people to push objects to your repos, but it’s $250 per user per year, with a minimum of 10 users ($2,500).
