For some time, I've been using GitHub Actions to update the content of my site. Through Hugo, these content updates automatically update the RSS feeds. This then makes the episodes appear in podcast services such as Apple Podcasts, Google Podcasts and Spotify. However, throughout that time I have been manually uploading the podcast files to my storage account. It wasn't a significant overhead, but I kept thinking that there must be a better way to do this. And there is - I've implemented it! This blog post will walk you through why I've made these changes, how I made them, and what the result is.

There are mixed opinions out there on storing large binary files directly within a Git repository. As a new user cloning the repository, I have to bring down every single binary that has been uploaded. So if I wanted to contribute to a repository from my local machine and there are tens or hundreds of podcast audio files, then I have to download all of those just to make my change. That requires more bandwidth, more time to download, and ultimately takes up more space on the end-user's machine, when that's likely not needed.

Enter Git LFS. Git Large File Storage (LFS) is an open source extension to Git which focuses on the versioning of large files. How does it work? It's actually quite simple. Rather than storing the binary file directly in the Git repository, it stores a text file containing a pointer to a remote location. What's at the remote location? The binary file that you originally wanted to version! (By the way, if you prefer something more visual - I have a brief 5 minute intro available here!)

Okay, dear user - how do you get set up with Git LFS? The Git LFS site I linked to above explains this really well, but I'll explain it in the context of my use case a little further. First off, I downloaded the extension shown on the Git LFS site and followed the appropriate installation steps (e.g. an exe file on Windows; for my Ubuntu WSL environment I had to add a new repository and use apt-get to download the binaries). Once installed, you can go ahead and run the command below. This is a one-time step that is needed for your local user account.

```bash
git lfs install
```

Great - at this point, Git LFS is installed on your local machine. But how do you ensure that your repository is actually using Git LFS? These next steps will need to be completed on a per-repository basis - in other words, for each repository where you'd like to use Git LFS. In your local Git repository, you'll use the `git lfs track` command to tell Git LFS which file patterns it should manage.

That covers the local side. The upload itself happens in my GitHub Actions workflow. The job runs on `ubuntu-latest`, targets my production environment, and exposes the podcast audio folder through an environment variable; an Azure CLI step then uploads any podcast files that don't yet exist in storage:

```yaml
runs-on: ubuntu-latest
environment:
  name: production
  url: ...
env:
  PODCAST_AUDIO_LOCATION: $GITHUB_WORKSPACE/podcast_audio
steps:
  - name: Upload podcast files to storage that don't yet exist
    uses: ...
    with:
      azcliversion: 2.20.0
      inlineScript: |
        az storage blob upload-batch --account-name cloudwithchrisprod -d 'podcasts' -s $PODCAST_AUDIO_LOCATION --if-unmodified-since ...T00:00Z
```

Chris, that looks to do the job.