As a Network Automation Engineer I had to deal with complex CI/CD workflows for provisioning and maintaining multi-vendor networks made up of hundreds of networking devices. In most cases (read — always) the CI/CD pipeline renders templates, backs up device configurations, and collects pre- and post-check test results. While I could see the satisfying pipeline results via AWX (the community version of Ansible Tower), I had to dig into the Docker containers in order to extract the files generated during the provisioning process.
A CI/CD workflow, most of the time, means temporary folders with weird names and files scattered across the container filesystem. So, in those few cases where things did not work as expected (read — often), or someone needed to dig into the test result files, I had to log into the appropriate container, run some magic commands or regexes to find where the wanted file was located, and scp that file wherever I could (read — cat, copy and paste). I know, I could have mounted an external Docker volume mapping the container folder where the files were generated, but that would still have required some ssh and scp commands to deliver the files to the requester. Most important of all, though, I could not source control the files generated during the CI/CD run.
Mater artium necessitas — necessity is the mother of invention. For all the reasons above (including not having people ask me to send them result files every day), I decided to add a further stage to the workflow, where all the files generated during a pipeline iteration could be pushed to Git. With Ansible as the main gear for automation, I went looking for some sort of git module that would let me add, commit and push files. While googling, I stumbled on this PR where some fellow DevOps/Automation Engineers had the same problem as mine. The proposed solution was to write an ad-hoc module that could handle A(dd), C(ommit), P(ush) operations. Scrolling down the page, I realized that nobody had started implementing that module since December. "Great! Let me do that! I love to write reliable crap and to have the false belief that I've built something useful for the community." Sleeves rolled up, editor open, and after a few days I came up with the git_acp Ansible module (have a look here).
For different reasons (we might talk about that in another post) I decided to withdraw my merge request from the Ansible repo and build a pip package instead, to make the module easy to install for whoever wants to use it.
A few git_acp examples, extracted from the README.md:
```yaml
- name: HTTPS | push all changes.
  git_acp:
    user: Federico87
    token: mytoken
    path: /Users/git/git_acp
    branch: master
    comment: Add all the things.
    add: [ "." ]
    mode: https
    url: "https://gitlab.com/networkAutomation/git_test_module.git"

- name: SSH | push file1 and file2.
  git_acp:
    path: /Users/git/git_acp
    branch: master
    comment: Add file1 and file2.
    add: [ file1, file2 ]
    mode: ssh
    push_option: ci.skip
    url: "[email protected]:networkAutomation/git_test_module.git"

- name: LOCAL | push file1 on local repo.
  git_acp:
    path: "~/test_directory/repo"
    branch: master
    comment: Add file1.
    add: [ file1 ]
    mode: local
    url: /Users/federicoolivieri/test_directory/repo.git
```
Because the pipeline is triggered by every git push or git merge, passing push_option as a module argument keeps me from falling into an endless CI/CD loop every time new files are pushed to Git at the end of a pipeline iteration.
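Under the hood, the module boils down to the three git steps its name spells out, plus the optional push option. As a rough illustration (not the module's actual code — the paths, file name and commit message below are made up), here is the same flow against a throwaway local repository:

```shell
set -e

# Throwaway "remote"; push options need Git >= 2.10 and a receiving
# end that advertises them, hence the config below.
tmp=$(mktemp -d)
git init --quiet --bare "$tmp/remote.git"
git -C "$tmp/remote.git" config receive.advertisePushOptions true

# Working repo with one generated file, standing in for pipeline output.
git init --quiet "$tmp/work"
cd "$tmp/work"
git config user.email "[email protected]"
git config user.name "CI"
echo "pipeline artifact" > file1

git add file1                      # A(dd)
git commit --quiet -m "Add file1." # C(ommit)
# P(ush), with the equivalent of the module's push_option: ci.skip
git push --quiet -o ci.skip "$tmp/remote.git" HEAD:master
```

On a GitLab remote, that `-o ci.skip` push option is what tells the server not to trigger a new pipeline for the push.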
I have extensively tested the module and I believe that all the possible git errors and fatals are handled (read — I have no idea). But I expect some bugs to surface sooner or later, so please raise an issue whenever you find one. Also, I am more than happy to add features if you need them.