Source Control Trials and Tribulations
I have spent far too much time on this: trying to settle on a source control workflow for my personal game development efforts. But I’m now happy with where things landed, and I wanted to document my conclusions in case they help anyone else.
TL;DR: Go do what Steve Streeting did.
I had a number of requirements for any Source Control solution I used for personal use.
1. Secure
2. Redundant Backups
3. Minimal Admin
4. Fast
5. Code Reviews
Secure: It needs to meet a minimum bar for “unlikely to be hacked or disrupted”. I’m less concerned about the source being made public than about it being corrupted or made unavailable.
Redundant Backups: The machine itself must have some redundancy. Additionally, backups to a local device and backups to an offsite location are required.
Minimal Admin: I don’t want to spend a lot of time in an SSH session trying to keep things working. I’d like some nice admin interfaces and automation. It’s not that I can’t do those things, but my goal here is to make a game, not play sysop.
Fast: Connections are really fast these days. If I can’t get at least close to saturating whatever connection I’m using, it’s no good. Ideally I’d like it even faster, so maybe I host it locally (on at least gigabit, with the option of going fiber) to satisfy the most likely access scenario.
Code Reviews: This isn’t really about “compliance” or “code health”. It’s mainly because I’d like contributors to have the option of a second set of eyes on their code, for both learning and discussion.
What we had: Subversion (in the cloud)
Probably about 2 years ago, I set up an Amazon Lightsail instance with a domain, SSL certificate, and an Apache/SVN setup. Backups, network security, all that was handled by Amazon. Honestly, it worked really well. If you have a need for a mostly automated, premium hosting solution, I heartily recommend it. We’ll still use it for any lightweight (offsite) servers we need.
It satisfied all my requirements except #5 (and technically #2). Subversion doesn’t easily lend itself to code reviews, and the backups were all on AWS. But there were still a few problems:
- It still required some administration. It’s on the internet, so I’ve got to keep security packages up to date. Occasionally the SSL cert would fail to update (I was using Let’s Encrypt). Any changes to users or repositories still required me to log into the server.
- It’s on the internet. Great for remote access, less great for general peace of mind.
- It satisfied #4 right up until I decided that going forward, I wanted to include the entire Unreal Engine 4 (UE4) in source form alongside the game code. That meant going from 1-2GB to 20GB+, and probably would have caused issues in the long run anyway. Our connection is pretty great, but not that great given the datasets we’re talking about (and how “efficient” Subversion seems to be with network traffic).
- I eventually found a solution for #5, but that was after I had begun my investigations into alternatives.
- Not listed as a requirement, but as I expanded our needs (CI/CD or similar), it may not scale as well.
I really wanted to move everything to local-network (Gigabit+ LAN) with optional VPN access to the entire network if remote access is required.
Because I wanted things on the local network, that meant I finally needed a local server. For testing purposes I grabbed an old Laptop (Dell Studio 15) and installed UnRaid on it.
It was my first experience with UnRaid, and I was super impressed. Provision a USB stick for it, and plug it in. Find the IP address of the machine once it’s started up, point your browser there, and you’ve got a NAS with super-powers. You can easily configure drives with redundancy, docker containers, and Virtual Machines. All within a relatively nice web interface.
It’s paid-for software, but in my opinion it’s very reasonable (a one-off $59-$129 depending on the number of attached storage devices). You get a 30-day trial anyway, so give it a shot.
This gave me a platform to quickly try things out and blow them away when things went wrong, or I changed direction. Now that I know what I need, I’ll move this all over to a “proper” server (either something from Bargain Hardware UK or an unused machine I have lying around with some additional disks).
Option 1: Perforce
I use Perforce at work, and am fairly rusty with it. So the idea of having an excuse to get more familiar with it sounded appealing. It’s also what all the AAA studios use for UE4, so support will be top-notch (especially given how integrated source control is with UE4). So I figured I’d try out the free 5-user version of Helix Core (the new name for Perforce).
Initial setup was relatively simple for both the Helix Core and Helix Swarm Servers (the latter being the code review system that integrates closely with Perforce). But I ran into two major problems.
It’s still a super manual process. The admin tool is fairly barebones; most of the configuration still happens through text configs. Any backup solution would need to be home-grown. Perforce can generate checkpoints (full state) and journals (incremental), but honestly I didn’t even get that far.
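For reference, that backup machinery is driven from the `p4d` binary itself. A nightly cron sketch looks something like this (the server root path is an assumption; adjust to your install):

```shell
# Nightly Perforce backup sketch. P4ROOT is a placeholder path.
P4ROOT=/opt/perforce/root

# Take a compressed checkpoint and rotate the journal in one step.
p4d -r "$P4ROOT" -z -jc

# Restoring later means replaying the newest checkpoint, then the journal:
#   p4d -r "$P4ROOT" -jr checkpoint.N.gz
#   p4d -r "$P4ROOT" -jr journal
```

You’d still need to ship the checkpoint files (and the versioned file archives) somewhere offsite yourself, which is exactly the home-grown part I didn’t want to maintain.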
The way you configure clients and depots/streams is, for some reason, completely incompatible with the way my brain works. I probably could have survived with a basic depot and called it a day. But I really liked the idea of having the engine as a separate stream that integrated with the game stream. And I simply gave up.
I liked that the typemap sets file behaviour at a depot/stream level. But everything else, from ignore files to user authentication and even terminology, just feels outdated.
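For a flavour of what that looks like, here’s a sketch of a UE4-oriented typemap (edited via `p4 typemap`; the depot paths are illustrative). The `+l` modifier gives binary assets exclusive locking so two people can’t stomp on the same `.uasset`:

```
TypeMap:
	binary+l //depot/....uasset
	binary+l //depot/....umap
	text //depot/....cpp
	text //depot/....h
```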
Pros:

- Easy enough install
- Solid UE4 integration
- Code Review via Swarm
- Even over a VPN, I know pulls/commits aren’t bad

Cons:

- Complicated depot/stream/client setup (beyond a base-level)
- Confusing terminology
- Manual server administration
Even if I got it working to a level I was happy with, it was apparent that there would be a sizable amount of administration going forward.
If you’re intimately familiar with Perforce already, I’d actually highly recommend it. Administration is still a bit onerous, but the support staff were very responsive (although they completely failed to understand what I meant by “hidden costs”, whilst refusing to tell me what it’d cost once we went past the free limit of 5 users and 20 workspaces).
Ultimately, I decided I would leave it alone with a new-found appreciation for the Perforce wranglers at work.
Option 2: Subversion (again)
I like how simple Subversion is to get set up. So it seemed like it was worth giving it another shot. But I didn’t want to manually administer it anymore.
I managed to find that CollabNet have a package called Subversion Edge, which is an all-in-one package including the HTTP server and a web interface for administering the Subversion install (including backup cronjobs, users, repositories, and email). They even have a docker container for it.
I also wanted code review capability. Given that Subversion has no native way to submit something to the server, outside the mainline, so that it’s available for review, I knew I was going to have a problem.
I found a project called Review Board, which supports a bunch of repository types, and interestingly Subversion is among them. Once Review Board is set up and configured against the repository, you can submit anything you have marked for commit in Subversion to Review Board as a review request.
This creates a diff in the Review Board UI, where you can select reviewers and add a description, and it emails those involved. Reviewers can add comments, and you can do further updates and re-sync the request as needed. The final “merge” is simply you committing within Subversion. It actually works really well. The only downside is that unless you use shelving or similar, you’re kind of blocked on further work until the review is finished.
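Concretely, the submission side is handled by Review Board’s client tools (RBTools). From a Subversion working copy, the loop is roughly (the review request ID below is illustrative):

```shell
# Install Review Board's client tools.
pip install rbtools

# From the Subversion working copy, post your pending changes
# as a new review request:
rbt post

# After addressing comments, update that same request
# (using the ID returned by the first post):
rbt post -r 42
```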
That all worked well, once I got it running. But my problems came down to Subversion itself.
The first problem was something I’d forgotten about: setting up ignores and file-locking rules is a bit clunky.
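Both are done through Subversion properties, set per-file or per-directory and committed like any other change. A sketch, with illustrative paths:

```shell
# Ignore build artefacts at the project root (one pattern per line).
svn propset svn:ignore 'Intermediate
Saved
DerivedDataCache' .

# Make a binary asset read-only until someone explicitly takes a lock.
svn propset svn:needs-lock '*' Content/Maps/Main.umap

svn commit -m "Configure ignores and locking properties"
```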
But the main problem was that even over a LAN, committing the entire UE4 source (5-6 GB) took forever and ultimately timed out. Even plain data transfers were painfully slow. I also wanted all the dependencies and such associated with UE4 included, which bumps it up to 17-20GB. So that just didn’t seem feasible.
So Subversion could work, but it wasn’t fast enough for a robust solution.
Pros:

- Easy Setup
- Integrated Solution
- Web Admin Interface
- Functional Code Review Framework

Cons:

- Fiddly Review Process
- Painfully slow with large (multi-GB) commits
Option 3: Git
I wasn’t happy with the idea of moving to Git, but it’s something I’m intimately familiar with. Steve Streeting had also done a bunch of work beforehand, so I was aware of some interesting solutions in the space and thought I should start there.
Steve had done the research, and ultimately decided on Gitea. Gitea is basically a self-hosted GitHub. It handles Git LFS 2 locking, and it just works. All I had to set up was a MySQL docker container and a Gitea docker container; it literally took minutes to configure.
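If you want to reproduce that, a docker-compose sketch looks something like this (volume paths and passwords are placeholders, not what I actually use):

```yaml
# Gitea + MySQL sketch. Change the passwords and volume paths.
services:
  db:
    image: mysql:8
    environment:
      MYSQL_DATABASE: gitea
      MYSQL_USER: gitea
      MYSQL_PASSWORD: changeme
      MYSQL_ROOT_PASSWORD: changeme
    volumes:
      - ./mysql:/var/lib/mysql
  gitea:
    image: gitea/gitea:1
    environment:
      GITEA__database__DB_TYPE: mysql
      GITEA__database__HOST: db:3306
      GITEA__database__NAME: gitea
      GITEA__database__USER: gitea
      GITEA__database__PASSWD: changeme
    ports:
      - "3000:3000"   # web UI
      - "2222:22"     # SSH for git
    volumes:
      - ./gitea:/data
    depends_on:
      - db
```

On UnRaid you’d set the same things up through its docker UI rather than compose, but the container images and environment variables are the same.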
I’ll have to write some scripts for doing backups, but Steve has already provided some that will form the basis for that.
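As a starting point, a minimal rotation script might look like the sketch below. It just archives a data directory with a timestamp and prunes old copies; the paths in the example call are assumptions for an UnRaid share layout. (Gitea also ships its own `gitea dump` command, which bundles the database and repositories into one zip, and would slot into the same rotation.)

```shell
# backup_gitea: tar a data directory into a backup directory with a
# timestamp, then prune, keeping only the 14 newest archives.
backup_gitea() {
  data_dir="$1"
  backup_dir="$2"
  stamp="$(date +%Y%m%d-%H%M%S)"

  mkdir -p "$backup_dir"
  tar -czf "$backup_dir/gitea-$stamp.tar.gz" -C "$data_dir" .

  # Delete everything beyond the 14 most recent archives.
  ls -1t "$backup_dir"/gitea-*.tar.gz | tail -n +15 | xargs -r rm --
}

# Example (paths are placeholders for an UnRaid share layout):
# backup_gitea /mnt/user/appdata/gitea /mnt/user/backups/gitea
```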
I was initially concerned that the limitations Steve outlines would be a problem, but given that I was considering Subversion, they seem workable. I was also concerned that Git would cause problems with the other team members because it tends to have a higher learning curve than other solutions. But I’ve spent 5 years supporting other team members using Git. So I’m less worried.
Gitea also provides a solution for Pull Requests (and therefore Code Review), and it all seems to work. There’ll be some teething problems, but I’m sure we’ll overcome them.
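For contributors coming from Subversion, the review loop is the standard branch-and-pull-request dance (the branch name here is illustrative):

```shell
# Start work on a topic branch rather than committing to main directly.
git checkout -b feature/inventory-ui

# Commit as usual, then publish the branch to Gitea.
git add .
git commit -m "First pass at the inventory UI"
git push -u origin feature/inventory-ui

# Then, in the Gitea web UI: open a pull request from the branch,
# pick reviewers, and merge once it's approved.
```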
The speed I can clone a 20GB repository across the network is reasonable (35-50MB/s), so a fresh clone takes about 6-ish minutes.
Gitea itself handles the admin. I can create repos and users easily, do basic maintenance tasks, etc.
On the UE4 side, I’m using Steve’s fork of the Git LFS 2 plugin, and it seems to work well enough.
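The LFS side of that is driven by `.gitattributes`. A sketch of the relevant entries, marking UE4 binary formats as LFS-tracked and lockable (extend the pattern list to taste):

```
# Track UE4 binary assets in LFS and mark them lockable.
*.uasset filter=lfs diff=lfs merge=lfs -text lockable
*.umap   filter=lfs diff=lfs merge=lfs -text lockable
```

The `lockable` attribute is what lets `git lfs lock` make files read-only until you take the lock, which is the behaviour the UE4 plugin hooks into.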
So thanks to Steve for doing the hard work here; it saved me a lot of time :D
Pros:

- Easy setup
- Intuitive administration
- Good UE4 Integration
- Clear Backup Strategy

Cons:

- Distributed nature requires workarounds
I’ve got to get off-site backups working (probably to S3/Glacier). I’ve also got to move it off my Laptop. But I’m relatively happy that this solution has ticked all the boxes.
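The offsite leg will probably be a one-liner on top of the local backup rotation, something like this (bucket name and paths are placeholders; requires the AWS CLI with credentials configured):

```shell
# Push local backup archives to S3 using a cold storage class.
aws s3 sync /mnt/user/backups/gitea s3://my-gitea-backups \
    --storage-class GLACIER
```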
- Secure: It’s hosted on our local network. It has strong password access. I haven’t set up a VPN gateway yet, but that’ll be the only failure point.
- Redundant Backups: It’s fairly trivial to get this working. The array itself is redundant. And once I have on/offsite backups happening, we’ll be set.
- Minimal Admin: I’ve got a nice UI to make most changes. Docker manages the images themselves. And it all being local means I don’t have to worry too much about that specific component (Just usual local network security).
- Fast: I’ve been really happy with its ability to handle massive commits/clones. No issues whatsoever, and it’s all so-far within “get a coffee” timeframes.
- Code Reviews: Gitea allows pull requests into the main branch, so source commits can go through a code review process. Binary commits have special requirements (as outlined by Steve), so reviews don’t apply to them.