For people using slow, expensive, and/or flaky internet, liberal use of
caching can make a huge difference. The restricted setup of the gpjenkins
box has been a good test environment for this (Tor-only, a whitelist of
allowed IPs to visit, a home internet connection).
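One concrete example: Gradle re-downloads its dependencies unless its cache
survives between builds, so pointing GRADLE_USER_HOME at persistent storage
saves a lot of bandwidth. A minimal sketch, with a hypothetical cache path:

    # hypothetical cache location; GRADLE_USER_HOME is where Gradle keeps
    # its downloaded dependencies, so putting it on persistent storage
    # lets repeated builds reuse the same downloads
    export GRADLE_USER_HOME=/var/cache/fdroid/gradle
    mkdir -p "$GRADLE_USER_HOME"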
It seems that GitLab returns 500 errors a bit too frequently, so keep
retrying the `git pull` until it succeeds, to avoid an error email being
sent out over failed pulls.
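A minimal sketch of such a retry loop, with an arbitrary attempt limit and
pause rather than anything this repo actually ships:

    # retry the pull a few times before giving up, since the 500s are
    # usually transient; the attempt count and sleep are arbitrary
    n=0
    until git pull; do
        n=$((n + 1))
        [ "$n" -ge 5 ] && exit 1
        sleep 10
    done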
The ever-troublesome gpjenkins box needs to use HTTPS mirrors. Using HTTPS
also improves the security of the buildserver, since there have been CVEs
that HTTPS would protect against:
https://www.debian.org/security/2016/dsa-3733
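For reference, an HTTPS apt mirror is just a sources.list entry with an
https:// URL; a sketch assuming the deb.debian.org mirror and the stretch
suite (older releases also need apt-transport-https installed):

    # /etc/apt/sources.list: fetch packages over HTTPS instead of HTTP
    deb https://deb.debian.org/debian stretch main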
The other apps are too flaky on gpjenkins right now, and that's our
only box for running full buildserver tests. Once we get the
buildserver tests running on jenkins.debian.net, we can add a
bunch more apps to the test script. gpjenkins is an extra-locked-down
box, which is why the builds are flaky: gradle and maven downloads
regularly fail because they are blocked.
Turns out this one is a pain to get running on the locked-down Guardian
Project Jenkins box, since it uses git:// rather than https:// for its git
submodules.
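One generic workaround, not necessarily what was done here, is git's URL
rewriting, which is applied at fetch time and covers submodules without
touching .gitmodules:

    # transparently rewrite git:// URLs to https:// for every fetch,
    # submodules included
    git config --global url."https://".insteadOf "git://"
    git submodule update --init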
org.xcsoar hosts its own git repo. Self-hosted git repos can be flaky, and
they are blocked by the Guardian Project Jenkins server, so use an app from
github.com instead.
On the GP jenkins, I got this:

    + git pull
    From https://gitlab.com/fdroid/fdroiddata
       1df2d03..621ef4f  master     -> origin/master
    You are not currently on a branch. Please specify which
    branch you want to merge with. See git-pull(1) for details.

        git pull <remote> <branch>
Also, when cloning, there is no need to specify the branch and download
only that one: we already have only a single branch, and forcing master
isn't necessary.
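A sketch of the resulting pattern, using the fdroiddata repo from the log
above:

    # plain clone: git checks out the remote's default branch on its own,
    # so -b master and --single-branch are not needed
    git clone https://gitlab.com/fdroid/fdroiddata
    cd fdroiddata
    # being on an actual branch means a bare git pull knows what to merge,
    # avoiding the detached-HEAD error above
    git checkout master
    git pull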