I am running an Ansible playbook from system 1 which runs tasks on system 2 to take a backup; after that, I want to copy the backup file from system 2 to system 3.
I am automating the command below, where /bck1/test is on system 2 and /opt/backup is on system 3:
rsync -r -v -e ssh /bck1/test.* root@host3:/opt/backup
You can run the raw rsync command with the shell module:
tasks:
  - shell: rsync -r -v -e ssh /bck1/test.* root@host3:/opt/backup
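If you want to avoid raw shell entirely, the same copy can be sketched with Ansible's synchronize module (a wrapper around rsync). This is only an illustration using the question's host names and paths; the play structure and the use of delegate_to are assumptions, and synchronize has its own connection requirements between the two remote hosts:

```yaml
# Hypothetical sketch: target system 3 (host3) and delegate the task to
# system 2 (host2), so rsync runs on host2 pushing to host3. Agent
# forwarding or a deployed key is still needed between host2 and host3.
- hosts: host3
  tasks:
    - name: Copy backup files from system 2 to system 3
      synchronize:
        src: /bck1/
        dest: /opt/backup
      delegate_to: host2
```

Note that synchronize does not expand shell globs like test.*; syncing the containing directory (or using rsync_opts) is the usual workaround.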
For this to work, you will either need to have your private SSH key deployed on system 2 or, preferably, enable SSH agent forwarding, for example in your .ssh/config:
Host host2
ForwardAgent yes
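Before relying on forwarding in the playbook, it can help to confirm the agent is actually visible on system 2. A minimal hedged check task (the task itself is my addition, not part of the original setup):

```yaml
# Hypothetical check: ssh-add -l on system 2 should list your key when
# agent forwarding works; it fails with "Could not open a connection to
# your authentication agent" when it does not.
- name: Verify ssh agent forwarding works on system 2
  shell: ssh-add -l
  changed_when: false
```

changed_when: false keeps this read-only check from reporting a change on every run.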
Additionally, sshd on system 2 needs to accept agent forwarding. Here are some tasks I use to do this:
- name: Ensure sshd allows agent forwarding
  lineinfile: dest=/etc/ssh/sshd_config
              regexp=^#?AllowAgentForwarding
              line="AllowAgentForwarding yes"
              follow=yes
              backup=yes
  sudo: yes
  register: changed_sshd_config

- name: "Debian: Restart sshd"
  shell: invoke-rc.d ssh restart
  sudo: yes
  when:
    - ansible_distribution in ["Debian", "Ubuntu"]
    - changed_sshd_config | changed

- name: "CentOS 7: Restart sshd"
  shell: systemctl restart sshd.service
  sudo: yes
  when:
    - ansible_distribution == "CentOS"
    - ansible_distribution_major_version == "7"
    - changed_sshd_config | changed
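As a variation, the two distribution-specific restarts can often be collapsed into one task with the generic service module, since the only real difference is the service name ("ssh" on Debian/Ubuntu, "sshd" on CentOS). A sketch, assuming the same facts and register variable as above:

```yaml
# Hypothetical single restart task; the service name is picked per
# distribution, and the task only fires when sshd_config was changed.
- name: Restart sshd
  service:
    name: "{{ 'ssh' if ansible_distribution in ['Debian', 'Ubuntu'] else 'sshd' }}"
    state: restarted
  sudo: yes
  when: changed_sshd_config | changed
```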
There are two separate tasks for restarting sshd, one for Debian/Ubuntu and one for CentOS 7. Pick whichever you need, or adapt them to your system.
You might need to configure this in a separate playbook, because Ansible keeps an open SSH connection to the host, and after activating agent forwarding you will most probably need to re-connect.
PS: It's not the best idea to allow SSH login for the root user, but that's another topic. :)