In setting this up, my thinking was that I'd work on my repo and have it stored privately on GitLab. I'd then create a .gitlab-ci.yml file that, on every push, triggers a push to a Google Cloud Source Repository, which in turn triggers actions to update the project's bucket.
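Roughly what I had in mind for the .gitlab-ci.yml (a sketch only: MY_PROJECT and MY_REPO are placeholders, and the authentication part is exactly what I can't figure out):

```yaml
# Sketch: MY_PROJECT / MY_REPO are placeholders, and this assumes
# credentials for source.developers.google.com are somehow available
# to the runner.
stages:
  - mirror

mirror_to_google:
  stage: mirror
  script:
    - git push --mirror "https://source.developers.google.com/p/MY_PROJECT/r/MY_REPO"
  only:
    - master
```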
I went through the Generating Static Credentials guide, which links to a rather antiquated page that instructs you to create a .gitcookies file to store your static credentials. I don't understand what that's actually giving me, or how I could store it as a secret in GitLab and use it to trigger a push to a remote git repo.
This approach seemed simpler, since I wouldn't have to create credentials: everything in the GCR already has appropriate access to that gcloud project. But I may be totally off base here. On the flip side, creating static credentials to store in GitLab, which could then sync to the gcloud storage bucket, didn't seem entirely possible to me from the documentation.
- Is what I'm trying to do possible?
- Is it actually easier to just create custom push-only credentials and have a Gitlab runner handle the bucket syncing?
Thank you!
If you have access to your own GitLab server, you could instead make sure git is using a credential helper, which will cache your credentials on the server side. Alternatively, do the same on the GitLab build agent machine.
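The credential-helper setup is a single `git config` call; a minimal sketch (the one-hour timeout is an arbitrary choice):

```shell
# Cache credentials in memory for an hour after first use.
# The 'store' helper would persist them to disk instead, which
# survives reboots but keeps the secret in plain text.
git config --global credential.helper 'cache --timeout=3600'

# Verify which helper is active:
git config --global credential.helper   # prints: cache --timeout=3600
```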
If not (meaning gitlab.com, with no control over or access to the GitLab server/agent), you would need to include in your repo's sources an encrypted script that generates that gitcookie file.
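The idea is the same as Travis's encrypted files: commit only the ciphertext, and decrypt it at build time with a key passed in as a secret variable. A self-contained sketch of the round trip (the key/iv here are throwaway values, and `openssl` is assumed available):

```shell
# Generate a throwaway AES-256 key and IV, hex-encoded, the way
# Travis's encrypted-file feature does.
KEY=$(openssl rand -hex 32)
IV=$(openssl rand -hex 16)

# The plaintext script you do NOT want in the repo:
printf 'echo "cookie line" >> ~/.gitcookies\n' > gitcookie.sh

# Encrypt before committing (this is what gitcookie.sh.enc would hold):
openssl aes-256-cbc -K "$KEY" -iv "$IV" -in gitcookie.sh -out gitcookie.sh.enc

# In CI, decrypt using the key/iv passed as secret variables:
openssl aes-256-cbc -K "$KEY" -iv "$IV" -d -in gitcookie.sh.enc -out decrypted.sh

cmp -s gitcookie.sh decrypted.sh && echo "round-trip OK"
```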
See for instance `mholt/caddy/dist/gitcookie.sh.enc`, which is used by `mholt/caddy/.travis.yml#L17` in TravisCI (but can be adapted to GitLab CI). The `$encrypted_3df18f9af81d_key` and `$encrypted_3df18f9af81d_iv` would be passed to the build at runtime, because TravisCI supports passing encrypted variables.
Similarly, GitLab CI offers protected variables.
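A hypothetical `.gitlab-ci.yml` fragment using protected variables for the decryption key (`GITCOOKIE_KEY` and `GITCOOKIE_IV` are names made up for this sketch; they would be defined in the project's CI/CD settings):

```yaml
# Hypothetical fragment: GITCOOKIE_KEY and GITCOOKIE_IV are protected
# variables, so they are only exposed to jobs on protected branches/tags.
deploy:
  script:
    - openssl aes-256-cbc -K "$GITCOOKIE_KEY" -iv "$GITCOOKIE_IV" -d -in gitcookie.sh.enc -out gitcookie.sh
    - sh gitcookie.sh
    - git push "https://source.developers.google.com/p/MY_PROJECT/r/MY_REPO" HEAD:master
  only:
    - master
```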
Example of a `gitcookie.sh`:
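A minimal sketch, assuming the usual Google-generated form of the snippet; the token after `git-user=` is a placeholder for the value the static-credentials page generates:

```shell
#!/bin/sh
# Sketch of a gitcookie.sh; PLACEHOLDER_TOKEN stands in for the real
# static credential and must be replaced with the generated value.
touch ~/.gitcookies
chmod 0600 ~/.gitcookies

# Point git's HTTP layer at the cookie file
git config --global http.cookiefile ~/.gitcookies

# Append the cookie line; tr turns the commas into the tabs
# the Netscape cookie-file format requires
tr , '\t' <<\__END__ >>~/.gitcookies
source.developers.google.com,FALSE,/,TRUE,2147483647,o,git-user=PLACEHOLDER_TOKEN
__END__
```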