16380 1727204138.49169: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-twx
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
16380 1727204138.49536: Added group all to inventory
16380 1727204138.49538: Added group ungrouped to inventory
16380 1727204138.49544: Group all now contains ungrouped
16380 1727204138.49547: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml
16380 1727204138.62535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
16380 1727204138.62586: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
16380 1727204138.62607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
16380 1727204138.62661: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
16380 1727204138.62722: Loaded config def from plugin (inventory/script)
16380 1727204138.62724: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
16380 1727204138.62760: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
16380 1727204138.62831: Loaded config def from plugin (inventory/yaml)
16380 1727204138.62834: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
16380 1727204138.62908: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
16380 1727204138.63263: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
16380 1727204138.63266: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
16380 1727204138.63270: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
16380 1727204138.63275: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
16380 1727204138.63279: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
16380 1727204138.63336: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto
16380 1727204138.63388: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
16380 1727204138.63425: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
16380 1727204138.63491: group all already in inventory
16380 1727204138.63498: set inventory_file for managed-node1
16380 1727204138.63502: set inventory_dir for managed-node1
16380 1727204138.63503: Added host managed-node1 to inventory
16380 1727204138.63504: Added host managed-node1 to group all
16380 1727204138.63505: set ansible_host for managed-node1
16380 1727204138.63506: set ansible_ssh_extra_args for managed-node1
16380 1727204138.63508: set inventory_file for managed-node2
16380 1727204138.63511: set inventory_dir for managed-node2
16380 1727204138.63512: Added host managed-node2 to inventory
16380 1727204138.63513: Added host managed-node2 to group all
16380 1727204138.63514: set ansible_host for managed-node2
16380 1727204138.63514: set ansible_ssh_extra_args for managed-node2
16380 1727204138.63516: set inventory_file for managed-node3
16380 1727204138.63518: set inventory_dir for managed-node3
16380 1727204138.63518: Added host managed-node3 to inventory
16380 1727204138.63519: Added host managed-node3 to group all
16380 1727204138.63520: set ansible_host for managed-node3
16380 1727204138.63521: set ansible_ssh_extra_args for managed-node3
16380 1727204138.63523: Reconcile groups and hosts in inventory.
16380 1727204138.63526: Group ungrouped now contains managed-node1
16380 1727204138.63527: Group ungrouped now contains managed-node2
16380 1727204138.63529: Group ungrouped now contains managed-node3
16380 1727204138.63593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
16380 1727204138.63695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
16380 1727204138.63739: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
16380 1727204138.63761: Loaded config def from plugin (vars/host_group_vars)
16380 1727204138.63763: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
16380 1727204138.63769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
16380 1727204138.63775: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
16380 1727204138.63813: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
16380 1727204138.64078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204138.64157: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
16380 1727204138.64188: Loaded config def from plugin (connection/local)
16380 1727204138.64192: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
16380 1727204138.64714: Loaded config def from plugin (connection/paramiko_ssh)
16380 1727204138.64717: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
16380 1727204138.65440: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
16380 1727204138.65474: Loaded config def from plugin (connection/psrp)
16380 1727204138.65477: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
16380 1727204138.66060: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
16380 1727204138.66093: Loaded config def from plugin (connection/ssh)
16380 1727204138.66096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
16380 1727204138.67684: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
16380 1727204138.67718: Loaded config def from plugin (connection/winrm)
16380 1727204138.67721: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
16380 1727204138.67747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
16380 1727204138.67804: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
16380 1727204138.67881: Loaded config def from plugin (shell/cmd)
16380 1727204138.67883: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
16380 1727204138.67906: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
16380 1727204138.67961: Loaded config def from plugin (shell/powershell)
16380 1727204138.67963: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
16380 1727204138.68010: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
16380 1727204138.68156: Loaded config def from plugin (shell/sh)
16380 1727204138.68158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
16380 1727204138.68187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
16380 1727204138.68295: Loaded config def from plugin (become/runas)
16380 1727204138.68297: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
16380 1727204138.68451: Loaded config def from plugin (become/su)
16380 1727204138.68453: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
16380 1727204138.68585: Loaded config def from plugin (become/sudo)
16380 1727204138.68587: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
16380 1727204138.68620: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
16380 1727204138.68893: in VariableManager get_vars()
16380 1727204138.68911: done with get_vars()
16380 1727204138.69020: trying /usr/local/lib/python3.12/site-packages/ansible/modules
16380 1727204138.71346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
16380 1727204138.71439: in VariableManager get_vars()
16380 1727204138.71443: done with get_vars()
16380 1727204138.71445: variable 'playbook_dir' from source: magic vars
16380 1727204138.71446: variable 'ansible_playbook_python' from source: magic vars
16380 1727204138.71447: variable 'ansible_config_file' from source: magic vars
16380 1727204138.71448: variable 'groups' from source: magic vars
16380 1727204138.71449: variable 'omit' from source: magic vars
16380 1727204138.71449: variable 'ansible_version' from source: magic vars
16380 1727204138.71450: variable 'ansible_check_mode' from source: magic vars
16380 1727204138.71451: variable 'ansible_diff_mode' from source: magic vars
16380 1727204138.71452: variable 'ansible_forks' from source: magic vars
16380 1727204138.71453: variable 'ansible_inventory_sources' from source: magic vars
16380 1727204138.71453: variable 'ansible_skip_tags' from source: magic vars
16380 1727204138.71454: variable 'ansible_limit' from source: magic vars
16380 1727204138.71454: variable 'ansible_run_tags' from source: magic vars
16380 1727204138.71455: variable 'ansible_verbosity' from source: magic vars
16380 1727204138.71485: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml
16380 1727204138.71821: in VariableManager get_vars()
16380 1727204138.71834: done with get_vars()
16380 1727204138.71861: in VariableManager get_vars()
16380 1727204138.71870: done with get_vars()
16380 1727204138.71904: in VariableManager get_vars()
16380 1727204138.71915: done with get_vars()
16380 1727204138.71973: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
16380 1727204138.72158: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
16380 1727204138.72266: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
16380 1727204138.72791: in VariableManager get_vars()
16380 1727204138.72807: done with get_vars()
16380 1727204138.73168: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
16380 1727204138.73301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
16380 1727204138.74381: in VariableManager get_vars()
16380 1727204138.74384: done with get_vars()
16380 1727204138.74386: variable 'playbook_dir' from source: magic vars
16380 1727204138.74386: variable 'ansible_playbook_python' from source: magic vars
16380 1727204138.74387: variable 'ansible_config_file' from source: magic vars
16380 1727204138.74387: variable 'groups' from source: magic vars
16380 1727204138.74388: variable 'omit' from source: magic vars
16380 1727204138.74390: variable 'ansible_version' from source: magic vars
16380 1727204138.74391: variable 'ansible_check_mode' from source: magic vars
16380 1727204138.74392: variable 'ansible_diff_mode' from source: magic vars
16380 1727204138.74393: variable 'ansible_forks' from source: magic vars
16380 1727204138.74394: variable 'ansible_inventory_sources' from source: magic vars
16380 1727204138.74395: variable 'ansible_skip_tags' from source: magic vars
16380 1727204138.74395: variable 'ansible_limit' from source: magic vars
16380 1727204138.74396: variable 'ansible_run_tags' from source: magic vars
16380 1727204138.74396: variable 'ansible_verbosity' from source: magic vars
16380 1727204138.74431: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
16380 1727204138.74571: in VariableManager get_vars()
16380 1727204138.74591: done with get_vars()
16380 1727204138.74639: in VariableManager get_vars()
16380 1727204138.74643: done with get_vars()
16380 1727204138.74646: variable 'playbook_dir' from source: magic vars
16380 1727204138.74648: variable 'ansible_playbook_python' from source: magic vars
16380 1727204138.74649: variable 'ansible_config_file' from source: magic vars
16380 1727204138.74650: variable 'groups' from source: magic vars
16380 1727204138.74650: variable 'omit' from source: magic vars
16380 1727204138.74651: variable 'ansible_version' from source: magic vars
16380 1727204138.74652: variable 'ansible_check_mode' from source: magic vars
16380 1727204138.74653: variable 'ansible_diff_mode' from source: magic vars
16380 1727204138.74654: variable 'ansible_forks' from source: magic vars
16380 1727204138.74655: variable 'ansible_inventory_sources' from source: magic vars
16380 1727204138.74656: variable 'ansible_skip_tags' from source: magic vars
16380 1727204138.74657: variable 'ansible_limit' from source: magic vars
16380 1727204138.74658: variable 'ansible_run_tags' from source: magic vars
16380 1727204138.74659: variable 'ansible_verbosity' from source: magic vars
16380 1727204138.74703: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
16380 1727204138.74794: in VariableManager get_vars()
16380 1727204138.74821: done with get_vars()
16380 1727204138.74892: in VariableManager get_vars()
16380 1727204138.74899: done with get_vars()
16380 1727204138.74902: variable 'playbook_dir' from source: magic vars
16380 1727204138.74903: variable 'ansible_playbook_python' from source: magic vars
16380 1727204138.74904: variable 'ansible_config_file' from source: magic vars
16380 1727204138.74905: variable 'groups' from source: magic vars
16380 1727204138.74906: variable 'omit' from source: magic vars
16380 1727204138.74907: variable 'ansible_version' from source: magic vars
16380 1727204138.74908: variable 'ansible_check_mode' from source: magic vars
16380 1727204138.74909: variable 'ansible_diff_mode' from source: magic vars
16380 1727204138.74910: variable 'ansible_forks' from source: magic vars
16380 1727204138.74916: variable 'ansible_inventory_sources' from source: magic vars
16380 1727204138.74917: variable 'ansible_skip_tags' from source: magic vars
16380 1727204138.74919: variable 'ansible_limit' from source: magic vars
16380 1727204138.74920: variable 'ansible_run_tags' from source: magic vars
16380 1727204138.74921: variable 'ansible_verbosity' from source: magic vars
16380 1727204138.74962: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
16380 1727204138.75059: in VariableManager get_vars()
16380 1727204138.75068: done with get_vars()
16380 1727204138.75070: variable 'playbook_dir' from source: magic vars
16380 1727204138.75071: variable 'ansible_playbook_python' from source: magic vars
16380 1727204138.75072: variable 'ansible_config_file' from source: magic vars
16380 1727204138.75073: variable 'groups' from source: magic vars
16380 1727204138.75074: variable 'omit' from source: magic vars
16380 1727204138.75075: variable 'ansible_version' from source: magic vars
16380 1727204138.75077: variable 'ansible_check_mode' from source: magic vars
16380 1727204138.75080: variable 'ansible_diff_mode' from source: magic vars
16380 1727204138.75081: variable 'ansible_forks' from source: magic vars
16380 1727204138.75082: variable 'ansible_inventory_sources' from source: magic vars
16380 1727204138.75083: variable 'ansible_skip_tags' from source: magic vars
16380 1727204138.75086: variable 'ansible_limit' from source: magic vars
16380 1727204138.75087: variable 'ansible_run_tags' from source: magic vars
16380 1727204138.75088: variable 'ansible_verbosity' from source: magic vars
16380 1727204138.75145: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
16380 1727204138.75258: in VariableManager get_vars()
16380 1727204138.75276: done with get_vars()
16380 1727204138.75356: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
16380 1727204138.75576: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
16380 1727204138.75721: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
16380 1727204138.76365: in VariableManager get_vars()
16380 1727204138.76380: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
16380 1727204138.77880: in VariableManager get_vars()
16380 1727204138.77894: done with get_vars()
16380 1727204138.77923: in VariableManager get_vars()
16380 1727204138.77925: done with get_vars()
16380 1727204138.77927: variable 'playbook_dir' from source: magic vars
16380 1727204138.77927: variable 'ansible_playbook_python' from source: magic vars
16380 1727204138.77928: variable 'ansible_config_file' from source: magic vars
16380 1727204138.77928: variable 'groups' from source: magic vars
16380 1727204138.77929: variable 'omit' from source: magic vars
16380 1727204138.77930: variable 'ansible_version' from source: magic vars
16380 1727204138.77930: variable 'ansible_check_mode' from source: magic vars
16380 1727204138.77931: variable 'ansible_diff_mode' from source: magic vars
16380 1727204138.77931: variable 'ansible_forks' from source: magic vars
16380 1727204138.77932: variable 'ansible_inventory_sources' from source: magic vars
16380 1727204138.77933: variable 'ansible_skip_tags' from source: magic vars
16380 1727204138.77933: variable 'ansible_limit' from source: magic vars
16380 1727204138.77934: variable 'ansible_run_tags' from source: magic vars
16380 1727204138.77934: variable 'ansible_verbosity' from source: magic vars
16380 1727204138.77959: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
16380 1727204138.78017: in VariableManager get_vars()
16380 1727204138.78027: done with get_vars()
16380 1727204138.78070: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
16380 1727204138.78233: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
16380 1727204138.78292: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
16380 1727204138.79931: in VariableManager get_vars()
16380 1727204138.80009: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
16380 1727204138.82542: in VariableManager get_vars()
16380 1727204138.82547: done with get_vars()
16380 1727204138.82550: variable 'playbook_dir' from source: magic vars
16380 1727204138.82551: variable 'ansible_playbook_python' from source: magic vars
16380 1727204138.82552: variable 'ansible_config_file' from source: magic vars
16380 1727204138.82553: variable 'groups' from source: magic vars
16380 1727204138.82554: variable 'omit' from source: magic vars
16380 1727204138.82556: variable 'ansible_version' from source: magic vars
16380 1727204138.82557: variable 'ansible_check_mode' from source: magic vars
16380 1727204138.82558: variable 'ansible_diff_mode' from source: magic vars
16380 1727204138.82559: variable 'ansible_forks' from source: magic vars
16380 1727204138.82560: variable 'ansible_inventory_sources' from source: magic vars
16380 1727204138.82561: variable 'ansible_skip_tags' from source: magic vars
16380 1727204138.82562: variable 'ansible_limit' from source: magic vars
16380 1727204138.82563: variable 'ansible_run_tags' from source: magic vars
16380 1727204138.82564: variable 'ansible_verbosity' from source: magic vars
16380 1727204138.82620: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
16380 1727204138.82718: in VariableManager get_vars()
16380 1727204138.82753: done with get_vars()
16380 1727204138.82801: in VariableManager get_vars()
16380 1727204138.82805: done with get_vars()
16380 1727204138.82808: variable 'playbook_dir' from source: magic vars
16380 1727204138.82809: variable 'ansible_playbook_python' from source: magic vars
16380 1727204138.82810: variable 'ansible_config_file' from source: magic vars
16380 1727204138.82811: variable 'groups' from source: magic vars
16380 1727204138.82812: variable 'omit' from source: magic vars
16380 1727204138.82813: variable 'ansible_version' from source: magic vars
16380 1727204138.82814: variable 'ansible_check_mode' from source: magic vars
16380 1727204138.82815: variable 'ansible_diff_mode' from source: magic vars
16380 1727204138.82816: variable 'ansible_forks' from source: magic vars
16380 1727204138.82816: variable 'ansible_inventory_sources' from source: magic vars
16380 1727204138.82822: variable 'ansible_skip_tags' from source: magic vars
16380 1727204138.82823: variable 'ansible_limit' from source: magic vars
16380 1727204138.82824: variable 'ansible_run_tags' from source: magic vars
16380 1727204138.82825: variable 'ansible_verbosity' from source: magic vars
16380 1727204138.82872: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
16380 1727204138.82972: in VariableManager get_vars()
16380 1727204138.82993: done with get_vars()
16380 1727204138.83095: in VariableManager get_vars()
16380 1727204138.83110: done with get_vars()
16380 1727204138.83298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
16380 1727204138.83317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
16380 1727204138.83721: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
16380 1727204138.84325: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
16380 1727204138.84332: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
16380 1727204138.84370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
16380 1727204138.84506: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
16380 1727204138.84942: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
16380 1727204138.85172: Loaded config def from plugin (callback/default)
16380 1727204138.85176: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
16380 1727204138.88544: Loaded config def from plugin (callback/junit)
16380 1727204138.88548: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
16380 1727204138.88613: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
16380 1727204138.88892: Loaded config def from plugin (callback/minimal)
16380 1727204138.88896: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
16380 1727204138.88949: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
16380 1727204138.89150: Loaded config def from plugin (callback/tree)
16380 1727204138.89154: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
16380 1727204138.89569: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
16380 1727204138.89572: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_bridge_nm.yml **************************************************
11 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
16380 1727204138.89713: in VariableManager get_vars()
16380 1727204138.89738: done with get_vars()
16380 1727204138.89748: in VariableManager get_vars()
16380 1727204138.89761: done with get_vars()
16380 1727204138.89767: variable 'omit' from source: magic vars
16380 1727204138.89824: in VariableManager get_vars()
16380 1727204138.89959: done with get_vars()
16380 1727204138.90061: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] ***********
16380 1727204138.91533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
16380 1727204138.91737: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
16380 1727204138.91920: getting the remaining hosts for this loop
16380 1727204138.91922: done getting the remaining hosts for this loop
16380 1727204138.91926: getting the next task for host managed-node2
16380 1727204138.91931: done getting next task for host managed-node2
16380 1727204138.91934: ^ task is: TASK: Gathering Facts
16380 1727204138.91936: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204138.91939: getting variables
16380 1727204138.91942: in VariableManager get_vars()
16380 1727204138.91955: Calling all_inventory to load vars for managed-node2
16380 1727204138.91959: Calling groups_inventory to load vars for managed-node2
16380 1727204138.91962: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204138.91978: Calling all_plugins_play to load vars for managed-node2
16380 1727204138.91996: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204138.92001: Calling groups_plugins_play to load vars for managed-node2
16380 1727204138.92123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204138.92252: done with get_vars()
16380 1727204138.92261: done getting variables
16380 1727204138.92349: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.030) 0:00:00.030 *****
16380 1727204138.92375: entering _queue_task() for managed-node2/gather_facts
16380 1727204138.92377: Creating lock for gather_facts
16380 1727204138.93332: worker is 1 (out of 1 available)
16380 1727204138.93344: exiting _queue_task() for managed-node2/gather_facts
16380 1727204138.93406: done queuing things up, now waiting for results queue to drain
16380 1727204138.93409: waiting for pending results...
16380 1727204138.94113: running TaskExecutor() for managed-node2/TASK: Gathering Facts
16380 1727204138.94120: in run() - task 12b410aa-8751-749c-b6eb-00000000007e
16380 1727204138.94123: variable 'ansible_search_path' from source: unknown
16380 1727204138.94127: calling self._execute()
16380 1727204138.94496: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204138.94500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204138.94503: variable 'omit' from source: magic vars
16380 1727204138.94898: variable 'omit' from source: magic vars
16380 1727204138.94902: variable 'omit' from source: magic vars
16380 1727204138.94904: variable 'omit' from source: magic vars
16380 1727204138.94906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
16380 1727204138.94912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
16380 1727204138.95108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
16380 1727204138.95144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204138.95164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204138.95211: variable 'inventory_hostname' from source: host vars for 'managed-node2'
16380 1727204138.95304: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204138.95319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204138.95558: Set connection var ansible_module_compression to ZIP_DEFLATED
16380 1727204138.95574: Set connection var ansible_shell_executable to /bin/sh
16380 1727204138.95588: Set connection var ansible_connection to ssh
16380 1727204138.95604: Set connection var ansible_shell_type to sh
16380 1727204138.95894: Set connection var ansible_pipelining to False
16380 1727204138.95898: Set connection var ansible_timeout to 10
16380 1727204138.95900: variable 'ansible_shell_executable' from source: unknown
16380 1727204138.95903: variable 'ansible_connection' from source: unknown
16380 1727204138.95905: variable 'ansible_module_compression' from source: unknown
16380 1727204138.95917: variable 'ansible_shell_type' from source: unknown
16380 1727204138.95924: variable 'ansible_shell_executable' from source: unknown
16380 1727204138.95932: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204138.95941: variable 'ansible_pipelining' from source: unknown
16380 1727204138.95949: variable 'ansible_timeout' from source: unknown
16380 1727204138.95957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204138.96383: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
16380 1727204138.96404: variable 'omit' from source: magic vars
16380 1727204138.96420: starting attempt loop
16380 1727204138.96428: running the handler
16380 1727204138.96517: variable 'ansible_facts' from source: unknown
16380 1727204138.96545: _low_level_execute_command(): starting
16380 1727204138.96559: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
16380 1727204138.98167: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
16380 1727204138.98280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
16380 1727204138.98308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
16380 1727204138.98522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204138.98527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204138.98572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204139.00456: stdout chunk (state=3): >>>/root <<<
16380 1727204139.00694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204139.00764: stderr chunk (state=3): >>><<<
16380 1727204139.00775: stdout chunk (state=3): >>><<<
16380 1727204139.00807: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204139.00915: _low_level_execute_command(): starting
16380 1727204139.00929: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892 `" && echo ansible-tmp-1727204139.0089679-16477-54798325601892="` echo /root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892 `" ) && sleep 0'
16380 1727204139.02413: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204139.04533: stdout chunk (state=3): >>>ansible-tmp-1727204139.0089679-16477-54798325601892=/root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892 <<<
16380 1727204139.04649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204139.04905: stderr chunk (state=3): >>><<<
16380 1727204139.04915: stdout chunk (state=3): >>><<<
16380 1727204139.04941: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204139.0089679-16477-54798325601892=/root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204139.04984: variable 'ansible_module_compression' from source: unknown
16380 1727204139.05158: ANSIBALLZ: Using generic lock for ansible.legacy.setup
16380 1727204139.05168: ANSIBALLZ: Acquiring lock
16380 1727204139.05176: ANSIBALLZ: Lock acquired: 140602939598528
16380 1727204139.05184: ANSIBALLZ: Creating module
16380 1727204139.83661: ANSIBALLZ: Writing module into payload
16380 1727204139.83883: ANSIBALLZ: Writing module
16380 1727204139.83942: ANSIBALLZ: Renaming module
16380 1727204139.83962: ANSIBALLZ: Done creating module
16380 1727204139.84014: variable 'ansible_facts' from source: unknown
16380 1727204139.84028: variable 'inventory_hostname' from source: host vars for 'managed-node2'
16380 1727204139.84048: _low_level_execute_command(): starting
16380 1727204139.84060: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
16380 1727204139.84848: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
16380 1727204139.84869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
16380 1727204139.84887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
16380 1727204139.84984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204139.85024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
16380 1727204139.85042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204139.85094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204139.85224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204139.87013: stdout chunk (state=3): >>>PLATFORM <<<
16380 1727204139.87085: stdout chunk (state=3): >>>Linux <<<
16380 1727204139.87113: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 <<<
16380 1727204139.87136: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<<
16380 1727204139.87357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204139.87360: stdout chunk (state=3): >>><<<
16380 1727204139.87363: stderr chunk (state=3): >>><<<
16380 1727204139.87380: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204139.87531 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']
16380 1727204139.87536: _low_level_execute_command(): starting
16380 1727204139.87538: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
16380 1727204139.87681: Sending initial data
16380 1727204139.87684: Sent initial data (1181 bytes)
16380 1727204139.88141: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
16380 1727204139.88226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204139.88323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204139.88362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204139.88399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204139.92423: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<<
16380 1727204139.92641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204139.93095: stderr chunk (state=3): >>><<<
16380 1727204139.93099: stdout chunk (state=3): >>><<<
16380 1727204139.93102: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204139.93104: variable 'ansible_facts' from source: unknown
16380 1727204139.93106: variable 'ansible_facts' from source: unknown
16380 1727204139.93111: variable 'ansible_module_compression' from source: unknown
16380 1727204139.93335: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
16380 1727204139.93339: variable 'ansible_facts' from source: unknown
16380 1727204139.93697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/AnsiballZ_setup.py
16380 1727204139.94263: Sending initial data
16380 1727204139.94273: Sent initial data (153 bytes)
16380 1727204139.95271: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
16380 1727204139.95288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
16380 1727204139.95313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
16380 1727204139.95343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
16380 1727204139.95403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204139.95461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
16380 1727204139.95480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204139.95499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204139.95576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204139.97396: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
16380 1727204139.97439: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
16380 1727204139.97529: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp0vv1fnv7 /root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/AnsiballZ_setup.py <<<
16380 1727204139.97554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp0vv1fnv7" to remote "/root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/AnsiballZ_setup.py" <<<
16380 1727204140.04077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204140.04797: stderr chunk (state=3): >>><<<
16380 1727204140.04801: stdout chunk (state=3): >>><<<
16380 1727204140.04804: done transferring module to remote
16380 1727204140.04806: _low_level_execute_command(): starting
16380 1727204140.04808: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/ /root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/AnsiballZ_setup.py && sleep 0'
16380 1727204140.05843: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204140.05980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204140.06102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204140.06173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204140.08376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204140.08384: stdout chunk (state=3): >>><<<
16380 1727204140.08387: stderr chunk (state=3): >>><<<
16380 1727204140.08571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204140.08575: _low_level_execute_command(): starting
16380 1727204140.08578: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/AnsiballZ_setup.py && sleep 0'
16380 1727204140.10263: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204140.10591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
16380 1727204140.10610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204140.10632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204140.10719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 16380 1727204140.13226: stdout chunk (state=3): >>>import _imp # builtin <<< 16380 1727204140.13256: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 16380 1727204140.13339: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 16380 1727204140.13388: stdout chunk (state=3): >>>import 'posix' # <<< 16380 1727204140.13422: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 16380 1727204140.13453: stdout chunk (state=3): >>>import 'time' # <<< 16380 1727204140.13465: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 16380 1727204140.13511: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 16380 1727204140.13543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 16380 1727204140.13573: stdout chunk (state=3): >>>import 'codecs' # <<< 16380 1727204140.13609: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 16380 1727204140.13650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 16380 1727204140.13673: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc8b44d0> <<< 16380 1727204140.13694: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc883ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc8b6a20> <<< 16380 1727204140.13721: stdout chunk (state=3): >>>import '_signal' # <<< 16380 1727204140.13763: stdout chunk (state=3): >>>import '_abc' # <<< 16380 1727204140.13779: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 16380 1727204140.13903: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 16380 1727204140.13951: stdout chunk (state=3): >>>import '_collections_abc' # <<< 16380 1727204140.13955: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 16380 1727204140.14353: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 16380 1727204140.14357: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6690a0> <<< 16380 1727204140.14360: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code 
object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc669fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 16380 1727204140.14775: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 16380 1727204140.14810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 16380 1727204140.14838: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 16380 1727204140.14857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 16380 1727204140.14912: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a7e00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 16380 1727204140.14918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 16380 1727204140.14942: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a7ec0> <<< 16380 1727204140.14959: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 16380 1727204140.15014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 16380 1727204140.15018: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 16380 1727204140.15108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 16380 1727204140.15194: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6df830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 16380 1727204140.15199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6dfec0> <<< 16380 1727204140.15224: stdout chunk (state=3): >>>import '_collections' # <<< 16380 1727204140.15307: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6bfad0> import '_functools' # <<< 16380 1727204140.15329: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6bd1f0> 
<<< 16380 1727204140.15386: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a4fb0> <<< 16380 1727204140.15426: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 16380 1727204140.15450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 16380 1727204140.15477: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 16380 1727204140.15539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 16380 1727204140.15543: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 16380 1727204140.15546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 16380 1727204140.15695: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc703770> <<< 16380 1727204140.15702: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc702390> <<< 16380 1727204140.15705: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6bfe30> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc700c50> <<< 16380 1727204140.15732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 16380 1727204140.15761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc734710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a4230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 16380 1727204140.15824: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc734bc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc734a70> <<< 16380 1727204140.15842: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc734e60> <<< 16380 1727204140.15939: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a2d50> <<< 16380 1727204140.15963: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc735520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc735220> <<< 16380 1727204140.15995: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 16380 1727204140.16044: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc736420> <<< 16380 1727204140.16074: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 16380 1727204140.16358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc750650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc751d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc752c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc7532f0> <<< 16380 1727204140.16366: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc7521e0> <<< 16380 1727204140.16382: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 16380 1727204140.16429: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.16450: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc753d70> import 'lzma' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc7534a0> <<< 16380 1727204140.16486: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc736480> <<< 16380 1727204140.16521: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 16380 1727204140.16551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 16380 1727204140.16571: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 16380 1727204140.16586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 16380 1727204140.16646: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc487c50> <<< 16380 1727204140.16651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 16380 1727204140.16719: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.16729: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc4b4650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4b43b0> <<< 16380 1727204140.16736: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc4b45c0> <<< 16380 1727204140.16780: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc4b4830> <<< 16380 1727204140.16784: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc485e20> <<< 16380 1727204140.16805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 16380 1727204140.16930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 16380 1727204140.16959: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 16380 1727204140.17110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4b5f40> import 'weakref' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4b4bc0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc736600> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204140.17132: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 16380 1727204140.17177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 16380 1727204140.17217: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4e2270> <<< 16380 1727204140.17259: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 16380 1727204140.17284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204140.17429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 16380 1727204140.17455: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4fa3c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 16380 1727204140.17513: stdout chunk (state=3): >>>import 'ntpath' # <<< 16380 1727204140.17552: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc533170> <<< 16380 1727204140.17562: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 16380 1727204140.17610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 16380 1727204140.17624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 16380 1727204140.17676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 16380 1727204140.17774: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc559910> <<< 16380 1727204140.17856: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc533290> <<< 16380 1727204140.17896: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4fb050> <<< 16380 1727204140.17937: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc530980> 
<<< 16380 1727204140.17958: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4f9400> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4b6e10> <<< 16380 1727204140.18332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 16380 1727204140.18339: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6cbc330590> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_wry6_c8k/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 16380 1727204140.18495: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.18523: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 16380 1727204140.18549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 16380 1727204140.18583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 16380 1727204140.18667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 16380 1727204140.18707: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 16380 1727204140.18769: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc396030> import '_typing' # <<< 16380 1727204140.18930: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc36cf20> <<< 16380 1727204140.18979: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc333fb0> # zipimport: zlib available <<< 16380 1727204140.19012: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204140.19100: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 16380 1727204140.20725: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.22086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc36fe90> <<< 16380 1727204140.22329: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204140.22353: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc3c99a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3c9730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3c9040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 16380 1727204140.22379: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3c9a60> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc753230> import 'atexit' # <<< 16380 1727204140.22412: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc3ca6f0> <<< 16380 1727204140.22463: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc3ca930> <<< 16380 1727204140.22470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 16380 1727204140.22525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 16380 1727204140.22547: stdout chunk (state=3): >>>import '_locale' # <<< 16380 1727204140.22605: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3cae70> import 'pwd' # <<< 16380 1727204140.22663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 16380 1727204140.22768: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc22cbc0> <<< 16380 1727204140.22772: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc22e7e0> <<< 16380 1727204140.22802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc22f1a0> <<< 16380 1727204140.22871: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 16380 1727204140.22908: stdout chunk (state=3): 
>>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc230380> <<< 16380 1727204140.22925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 16380 1727204140.22975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 16380 1727204140.23066: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc232de0> <<< 16380 1727204140.23087: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc232f00> <<< 16380 1727204140.23204: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc2310a0> <<< 16380 1727204140.23207: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 16380 1727204140.23233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 16380 1727204140.23307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 16380 1727204140.23321: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc236d80> import '_tokenize' # <<< 16380 1727204140.23434: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc235850> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc2355b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 16380 1727204140.23495: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc237ce0> <<< 16380 1727204140.23531: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc231520> <<< 16380 1727204140.23587: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc27af00> <<< 16380 1727204140.23608: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc27b020> <<< 16380 1727204140.23743: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 16380 1727204140.23765: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc280bc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc280980> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 16380 1727204140.23891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 16380 1727204140.23945: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc283110> <<< 16380 1727204140.23971: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc281250> <<< 16380 1727204140.24001: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 16380 1727204140.24223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc28a930> <<< 16380 1727204140.24299: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc2832c0> <<< 16380 1727204140.24381: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.24404: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28b7a0> <<< 16380 1727204140.24435: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28b7d0> <<< 16380 1727204140.24474: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.24513: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28b2f0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc27b320> <<< 16380 1727204140.24578: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 16380 1727204140.24600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 16380 1727204140.24630: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.24762: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28ea20> <<< 16380 1727204140.24850: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.24983: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28fe60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc28d1c0> <<< 16380 1727204140.25014: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28e0c0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc28cd70> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 16380 1727204140.25097: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.25222: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16380 1727204140.25246: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 16380 1727204140.25271: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 16380 1727204140.25424: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.25565: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 
1727204140.26275: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.26983: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 16380 1727204140.27014: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 16380 1727204140.27060: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 16380 1727204140.27073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204140.27252: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc118080> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 16380 1727204140.27259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 16380 1727204140.27280: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc119490> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3ca840> <<< 16380 1727204140.27336: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 16380 1727204140.27361: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.27397: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.27412: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 16380 1727204140.27632: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.27859: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 16380 1727204140.27862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1195b0> <<< 16380 1727204140.27875: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.28603: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.28980: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.29065: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.29173: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 16380 1727204140.29221: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.29259: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 16380 1727204140.29283: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.29356: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.29488: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 16380 1727204140.29524: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 16380 1727204140.29536: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 16380 1727204140.29584: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.29645: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 16380 1727204140.29648: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.29937: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.30234: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 16380 1727204140.30330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 16380 1727204140.30334: stdout chunk (state=3): >>>import '_ast' # <<< 16380 1727204140.30432: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc11b440> <<< 16380 1727204140.30496: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.30528: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.30622: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 16380 1727204140.30707: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 16380 1727204140.30713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 16380 1727204140.30768: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.30975: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc121a90> <<< 16380 1727204140.30979: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc1223f0> <<< 16380 1727204140.31223: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc11a570> <<< 16380 1727204140.31226: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.31240: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204140.31262: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.31340: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 16380 1727204140.31390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204140.31492: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.31511: stdout chunk (state=3): >>># extension module 
'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc121250> <<< 16380 1727204140.31560: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1225a0> <<< 16380 1727204140.31658: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 16380 1727204140.31672: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.31727: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.31867: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.31893: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 16380 1727204140.31930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 16380 1727204140.31960: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 16380 1727204140.31986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 16380 1727204140.32036: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1ba810> <<< 16380 1727204140.32097: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc12c4a0> <<< 16380 1727204140.32244: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc12a660> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc12a420> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 16380 1727204140.32247: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32269: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 16380 1727204140.32336: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 16380 1727204140.32439: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204140.32510: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32550: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32553: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32595: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32643: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32679: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32714: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.namespace' # <<< 16380 1727204140.32736: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32813: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32900: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.32924: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.33187: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 16380 1727204140.33194: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.33373: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.33418: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.33475: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 16380 1727204140.33631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1c0da0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 16380 1727204140.33653: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 16380 1727204140.33704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 16380 1727204140.33742: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 16380 1727204140.33767: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70be60> <<< 16380 1727204140.33801: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.33872: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb70c1d0> <<< 16380 1727204140.33943: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc10cf20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc10c4d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1c2c60> <<< 16380 1727204140.33977: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1c2780> <<< 16380 
1727204140.34016: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 16380 1727204140.34053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 16380 1727204140.34167: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 16380 1727204140.34178: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb70f290> <<< 16380 1727204140.34203: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70eb40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb70ed20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70df70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 16380 1727204140.34330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70f440> <<< 16380 1727204140.34333: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 16380 1727204140.34370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 16380 1727204140.34411: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.34429: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb779f70> <<< 16380 1727204140.34543: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70ff50> <<< 16380 1727204140.34547: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1c2870> import 'ansible.module_utils.facts.timeout' # <<< 16380 1727204140.34558: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 16380 1727204140.34606: stdout chunk (state=3): >>># zipimport: zlib available <<< 
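[Editor's note] The trace above shows the AnsiballZ mechanism at work: the controller shipped a single zip payload ("# zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload.zip'"), and every subsequent "import 'ansible.module_utils...'" line resolves from inside that archive via Python's zipimport hook rather than from files installed on the managed node. Below is a minimal, self-contained sketch of that import pattern, not Ansible's actual wrapper; the payload path and the `demo_utils` package name are hypothetical stand-ins.

```python
# Sketch of the zipimport pattern seen in the trace (assumptions: the payload
# path and package name below are invented for illustration; Ansible's real
# AnsiballZ wrapper builds and loads its payload differently in detail).
import os
import sys
import tempfile
import zipfile

payload = os.path.join(tempfile.gettempdir(), "example_payload.zip")  # hypothetical path

# Build a tiny archive containing one package, standing in for the
# "ansible_ansible.legacy.setup_payload.zip" that the log reports.
with zipfile.ZipFile(payload, "w") as zf:
    zf.writestr("demo_utils/__init__.py", "VERSION = '1.0'\n")

# Putting the zip on sys.path activates the stdlib zipimport hook (the
# "# zipimport: zlib available" lines in the trace); imports then resolve
# from inside the archive, no files extracted to disk.
sys.path.insert(0, payload)
import demo_utils  # noqa: E402  -- loaded from the zip, not the filesystem

print(demo_utils.VERSION)
print(demo_utils.__file__)  # e.g. .../example_payload.zip/demo_utils/__init__.py
```

This is why only one file had to be transferred over SFTP earlier in the run: bundling the module plus its `module_utils` dependencies into one archive makes the remote execution self-contained, with no package installation required on the managed node.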
16380 1727204140.34661: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 16380 1727204140.34732: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.34754: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.34858: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 16380 1727204140.34862: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 16380 1727204140.34903: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 16380 1727204140.35006: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.35031: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 16380 1727204140.35080: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.35123: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 16380 1727204140.35139: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.35274: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.35294: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.35325: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.35404: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 16380 1727204140.35504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 16380 1727204140.35979: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.36496: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 16380 1727204140.36512: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.36558: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.36618: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.36655: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.36929: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 16380 1727204140.36932: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 16380 1727204140.36959: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.36986: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 16380 1727204140.37011: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.37049: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.37151: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 16380 1727204140.37173: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.37268: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 16380 1727204140.37297: stdout chunk (state=3): >>>import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb77b380> <<< 16380 1727204140.37321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 16380 1727204140.37354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 16380 1727204140.37494: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb77a930> import 'ansible.module_utils.facts.system.local' # <<< 16380 1727204140.37515: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.37580: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.37649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 16380 1727204140.37687: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.37788: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.37872: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 16380 1727204140.37947: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.38043: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 16380 1727204140.38046: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.38117: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.38142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 16380 1727204140.38194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 16380 1727204140.38268: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.38342: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb7a61b0> <<< 16380 1727204140.38571: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb792000> import 'ansible.module_utils.facts.system.python' # <<< 16380 1727204140.38603: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.38638: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.38819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 16380 1727204140.38822: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.38902: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.39027: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.39195: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 16380 1727204140.39219: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.39297: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.39312: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 16380 1727204140.39367: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.39439: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 16380 1727204140.39490: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204140.39558: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb5a5c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb7a5fa0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 16380 1727204140.39587: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 16380 1727204140.39602: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.39720: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 16380 1727204140.39724: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.39838: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.40007: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 16380 1727204140.40121: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.40228: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.40272: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.40331: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 16380 1727204140.40365: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16380 1727204140.40388: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.40564: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.40719: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 16380 1727204140.40790: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.40869: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.41015: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 16380 1727204140.41029: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.41075: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.41129: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.41771: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.42428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 16380 1727204140.42474: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.42595: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 16380 1727204140.42615: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.42714: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.42904: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 
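
The run of imports above pulls in one hardware-facts module per supported platform (aix, darwin, freebsd, dragonfly, hpux, linux, hurd, netbsd, openbsd, and sunos just below); at run time only the collector matching the managed host contributes facts. A hedged sketch of platform-keyed dispatch; the registry and function names here are hypothetical, and Ansible's real selection lives in ansible.module_utils.facts.default_collectors and also honors the requested gather_subset:

    import platform

    # Illustrative registry keyed by platform.system(); the real collector
    # classes are matched against the platform rather than looked up by name.
    def linux_hardware_facts():
        return {"ansible_system": "Linux", "source": "linux collector"}

    def openbsd_hardware_facts():
        return {"ansible_system": "OpenBSD", "source": "openbsd collector"}

    HARDWARE_COLLECTORS = {
        "Linux": linux_hardware_facts,
        "OpenBSD": openbsd_hardware_facts,
    }

    def collect_hardware_facts():
        collector = HARDWARE_COLLECTORS.get(platform.system())
        return collector() if collector else {}

    print(collect_hardware_facts())
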
16380 1727204140.42921: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.43013: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.43191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 16380 1727204140.43213: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.43223: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 16380 1727204140.43252: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.43297: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.43342: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 16380 1727204140.43353: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.43460: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.43571: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.43817: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 16380 1727204140.44069: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44103: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44153: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 16380 1727204140.44181: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44213: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44224: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 16380 1727204140.44297: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 16380 1727204140.44406: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44454: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44458: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 16380 1727204140.44470: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44521: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 16380 1727204140.44659: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.44739: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 16380 1727204140.44743: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.45119: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.45353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 16380 1727204140.45420: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.45551: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 16380 1727204140.45555: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.45590: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 16380 1727204140.45621: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.45707: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 16380 1727204140.45769: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.45772: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 16380 1727204140.45876: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.46009: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 16380 1727204140.46013: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.46054: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204140.46096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 16380 1727204140.46138: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.46142: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.46190: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.46248: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.46320: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.46420: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 16380 1727204140.46431: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 16380 1727204140.46492: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.46554: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 16380 1727204140.46557: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.46779: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.47011: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 16380 1727204140.47067: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.47123: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 16380 1727204140.47138: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.47180: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.47241: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 16380 1727204140.47245: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.47334: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.47437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 16380 1727204140.47448: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.47544: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.47649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 16380 1727204140.47741: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204140.48755: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 16380 1727204140.48800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 16380 1727204140.48842: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb5cf920> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb5ccb30> <<< 16380 1727204140.48912: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb5cc8f0> <<< 16380 1727204140.65983: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 16380 1727204140.66058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb614f80> <<< 16380 1727204140.66081: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 16380 1727204140.66086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 16380 1727204140.66088: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb615c70> <<< 16380 1727204140.66143: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204140.66184: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 16380 1727204140.66234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 16380 1727204140.66238: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb7ac470> <<< 16380 1727204140.66260: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb7ac950> <<< 16380 1727204140.66507: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 16380 1727204140.87040: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super 
User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "40", "epoch": "1727204140", "epoch_int": "1727204140", "date": "2024-09-24", "time": "14:55:40", "iso8601_micro": "2024-09-24T18:55:40.482760Z", "iso8601": "2024-09-24T18:55:40Z", "iso8601_basic": "20240924T145540482760", "iso8601_basic_short": "20240924T145540", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/<<< 16380 1727204140.87140: stdout chunk (state=3): >>>root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "", "ansible_loadavg": {"1m": 0.56005859375, "5m": 0.55029296875, "15m": 0.34228515625}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", 
"Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2843, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 874, "free": 2843}, "nocache": {"free": 3470, "used": 247}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 644, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251156127744, "block_size": 4096, "block_total": 64479564, "block_available": 61317414, "block_used": 3162150, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204140.87745: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 16380 1727204140.87749: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear 
sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression <<< 16380 1727204140.87885: stdout chunk (state=3): >>># cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils 
# destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy 
ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ <<< 16380 1727204140.87906: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob <<< 16380 1727204140.87949: stdout chunk (state=3): >>># cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] 
removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos<<< 16380 1727204140.87992: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector <<< 16380 1727204140.88102: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user<<< 16380 1727204140.88105: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 16380 1727204140.88879: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy 
zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 16380 1727204140.88921: stdout chunk (state=3): >>># destroy distro # destroy distro.distro <<< 16380 1727204140.88951: stdout chunk (state=3): >>># destroy argparse # destroy logging<<< 16380 1727204140.88965: stdout chunk (state=3): >>> <<< 16380 1727204140.89021: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors<<< 16380 1727204140.89026: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.ansible_collector <<< 16380 1727204140.89087: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy <<< 16380 1727204140.89092: stdout chunk (state=3): >>># destroy multiprocessing.pool # destroy signal # destroy pickle<<< 16380 1727204140.89132: stdout chunk (state=3): >>> # destroy _compat_pickle<<< 16380 1727204140.89135: stdout chunk (state=3): >>> # destroy _pickle <<< 16380 1727204140.89190: stdout chunk (state=3): >>># destroy queue <<< 16380 1727204140.89194: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.reduction<<< 16380 1727204140.89221: stdout chunk (state=3): >>> <<< 16380 1727204140.89246: stdout chunk (state=3): >>># destroy selectors # destroy shlex # destroy fcntl<<< 16380 1727204140.89261: stdout chunk (state=3): >>> # destroy datetime <<< 16380 1727204140.89283: stdout chunk (state=3): >>># destroy subprocess # destroy base64<<< 16380 1727204140.89323: stdout chunk (state=3): >>> # destroy _ssl <<< 16380 1727204140.89362: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 16380 1727204140.89392: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios<<< 16380 1727204140.89418: stdout chunk (state=3): >>> # destroy json<<< 16380 1727204140.89465: stdout chunk (state=3): >>> # destroy socket <<< 16380 1727204140.89476: stdout chunk (state=3): >>># destroy struct <<< 16380 1727204140.89505: stdout chunk (state=3): >>># destroy glob # destroy fnmatch<<< 16380 1727204140.89530: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata<<< 16380 1727204140.89576: stdout chunk (state=3): >>> # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context<<< 16380 1727204140.89579: stdout chunk (state=3): >>> # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection<<< 16380 1727204140.89678: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.idna <<< 16380 1727204140.89701: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux<<< 16380 1727204140.89747: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian <<< 16380 1727204140.89783: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves<<< 16380 1727204140.89819: stdout chunk (state=3): >>> # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader<<< 16380 1727204140.89822: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback<<< 16380 1727204140.89848: stdout chunk (state=3): >>> # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform<<< 16380 1727204140.89887: stdout chunk (state=3): >>> # cleanup[3] wiping atexit<<< 16380 1727204140.89927: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 16380 1727204140.89930: stdout chunk (state=3): >>> # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect<<< 16380 1727204140.89975: stdout chunk (state=3): >>> # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct<<< 16380 1727204140.89979: stdout chunk (state=3): >>> # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler<<< 16380 1727204140.90015: stdout chunk (state=3): >>> # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 16380 1727204140.90059: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools <<< 16380 1727204140.90070: stdout chunk (state=3): >>># cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc <<< 16380 1727204140.90100: stdout chunk (state=3): >>># cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig<<< 16380 1727204140.90142: stdout chunk (state=3): >>> # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath <<< 16380 1727204140.90166: stdout chunk (state=3): >>># cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix<<< 16380 1727204140.90200: stdout chunk (state=3): >>> <<< 16380 1727204140.90222: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 16380 1727204140.90255: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader <<< 16380 1727204140.90372: stdout chunk (state=3): >>># destroy systemd._journal # destroy _datetime <<< 16380 1727204140.90591: stdout chunk (state=3): >>># destroy sys.monitoring <<< 16380 1727204140.90603: stdout chunk (state=3): >>># destroy _socket <<< 16380 1727204140.90632: stdout chunk 
(state=3): >>># destroy _collections <<< 16380 1727204140.90678: stdout chunk (state=3): >>># destroy platform<<< 16380 1727204140.90720: stdout chunk (state=3): >>> # destroy _uuid # destroy stat # destroy genericpath <<< 16380 1727204140.90744: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 16380 1727204140.90790: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib<<< 16380 1727204140.90814: stdout chunk (state=3): >>> # destroy copyreg <<< 16380 1727204140.90856: stdout chunk (state=3): >>># destroy contextlib # destroy _typing <<< 16380 1727204140.90882: stdout chunk (state=3): >>># destroy _tokenize <<< 16380 1727204140.90901: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response<<< 16380 1727204140.90941: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves<<< 16380 1727204140.90958: stdout chunk (state=3): >>> # destroy _frozen_importlib_external # destroy _imp <<< 16380 1727204140.91022: stdout chunk (state=3): >>># destroy _io # destroy marshal # clear sys.meta_path <<< 16380 1727204140.91026: stdout chunk (state=3): >>># clear sys.modules <<< 16380 1727204140.91057: stdout chunk (state=3): >>># destroy _frozen_importlib <<< 16380 1727204140.91176: stdout chunk (state=3): >>># destroy codecs<<< 16380 1727204140.91194: stdout chunk (state=3): >>> # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io<<< 16380 1727204140.91217: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 16380 1727204140.91232: stdout chunk (state=3): >>> # destroy math # destroy _bisect <<< 16380 1727204140.91293: stdout chunk (state=3): >>># destroy time # destroy _random <<< 16380 1727204140.91305: stdout chunk (state=3): >>># destroy _weakref <<< 16380 1727204140.91332: stdout chunk (state=3): >>># destroy _operator<<< 16380 1727204140.91367: stdout chunk (state=3): >>> # destroy _sha2 # destroy _sre # destroy _string # destroy re<<< 16380 1727204140.91370: stdout chunk (state=3): >>> <<< 16380 1727204140.91423: stdout chunk (state=3): >>># destroy itertools # destroy _abc <<< 16380 1727204140.91428: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 16380 1727204140.91451: stdout chunk (state=3): >>> # clear sys.audit hooks <<< 16380 1727204140.92342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
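
At this point the payload's teardown (the long run of "# cleanup" / "# destroy" lines above) has finished, the SSH master has reported exit status 0, and the controller is left holding the module's stdout, whose first JSON document is the {"ansible_facts": ...} blob printed earlier in this run. A minimal sketch of pulling selected fields out of such a result; the trimmed JSON literal below reuses keys and values that appear verbatim in this log:

    import json

    # Trimmed stand-in for the {"ansible_facts": ...} payload above; the
    # top-level ansible_facts/invocation shape is exactly what the setup
    # module printed in this run.
    raw = '''{"ansible_facts": {"ansible_distribution": "Fedora",
                                "ansible_distribution_major_version": "39",
                                "ansible_default_ipv4": {"address": "10.31.9.159"},
                                "ansible_pkg_mgr": "dnf"},
              "invocation": {"module_args": {"gather_timeout": 10}}}'''

    result = json.loads(raw)
    facts = result["ansible_facts"]
    print(facts["ansible_distribution"],
          facts["ansible_distribution_major_version"])
    print(facts["ansible_default_ipv4"]["address"], facts["ansible_pkg_mgr"])
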
<<< 16380 1727204140.92349: stdout chunk (state=3): >>><<< 16380 1727204140.92351: stderr chunk (state=3): >>><<< 16380 1727204140.92711: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc8b44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc883ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc8b6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6690a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc669fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a7e00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a7ec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6df830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6dfec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6bfad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6bd1f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a4fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc703770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc702390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6bfe30> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc700c50> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc734710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a4230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc734bc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc734a70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc734e60> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc6a2d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc735520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc735220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc736420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc750650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc751d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6cbc752c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc7532f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc7521e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc753d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc7534a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc736480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc487c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc4b4650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4b43b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc4b45c0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc4b4830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc485e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4b5f40> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4b4bc0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc736600> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4e2270> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4fa3c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc533170> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc559910> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc533290> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4fb050> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc530980> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4f9400> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc4b6e10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6cbc330590> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_wry6_c8k/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc396030> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc36cf20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc333fb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc36fe90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc3c99a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3c9730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3c9040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3c9a60> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc753230> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc3ca6f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc3ca930> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3cae70> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc22cbc0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc22e7e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc22f1a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc230380> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc232de0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc232f00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc2310a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc236d80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc235850> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc2355b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc237ce0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc231520> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc27af00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc27b020> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc280bc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc280980> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc283110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc281250> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc28a930> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc2832c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28b7a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28b7d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28b2f0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc27b320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28ea20> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28fe60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc28d1c0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc28e0c0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc28cd70> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc118080> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc119490> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc3ca840> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1195b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc11b440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc121a90> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc1223f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc11a570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbc121250> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1225a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1ba810> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc12c4a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc12a660> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc12a420> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1c0da0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70be60> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb70c1d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc10cf20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc10c4d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1c2c60> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1c2780> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb70f290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70eb40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb70ed20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70df70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70f440> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb779f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb70ff50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbc1c2870> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb77b380> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb77a930> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb7a61b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb792000> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb5a5c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb7a5fa0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6cbb5cf920> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb5ccb30> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb5cc8f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb614f80> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb615c70> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb7ac470> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6cbb7ac950> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "40", "epoch": "1727204140", "epoch_int": "1727204140", "date": "2024-09-24", "time": "14:55:40", 
"iso8601_micro": "2024-09-24T18:55:40.482760Z", "iso8601": "2024-09-24T18:55:40Z", "iso8601_basic": "20240924T145540482760", "iso8601_basic_short": "20240924T145540", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "", "ansible_loadavg": {"1m": 0.56005859375, "5m": 0.55029296875, "15m": 0.34228515625}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, 
"ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2843, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 874, "free": 2843}, "nocache": {"free": 3470, "used": 247}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 644, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, 
"ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251156127744, "block_size": 4096, "block_total": 64479564, "block_available": 61317414, "block_used": 3162150, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib 
# cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] 
removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing 
ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy 
ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy 
_sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy 
collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
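Both warnings above are avoidable. The junk after the JSON data is CPython's import/shutdown chatter, emitted because PYTHONVERBOSE=1 is set in the remote environment (visible under ansible_env in the facts above); removing that variable from the remote shell profile silences it. The interpreter warning disappears once the interpreter is pinned rather than discovered. A minimal inventory sketch, assuming the host entry used by this run (the interpreter pin is an addition for illustration, not part of the original inventory):

all:
  hosts:
    managed-node2:
      ansible_host: 10.31.9.159
      ansible_python_interpreter: /usr/bin/python3.12   # pin the interpreter this run discovered, silencing the warning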
16380 1727204140.95352: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204140.95412: _low_level_execute_command(): starting 16380 1727204140.95415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204139.0089679-16477-54798325601892/ > /dev/null 2>&1 && sleep 0' 16380 1727204140.96301: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204140.96305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204140.96518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204140.96581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204140.99437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204140.99441: stdout chunk (state=3): >>><<< 16380 1727204140.99449: stderr chunk (state=3): >>><<< 16380 1727204140.99467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
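With the module executed and its remote tmpdir removed, the gathered facts travel back in the task result and become available to later tasks. A small sketch, not part of this test run, consuming two of the values gathered above:

- name: Reference facts gathered earlier in the run
  ansible.builtin.debug:
    msg: "{{ ansible_facts['hostname'] }} has address {{ ansible_facts['default_ipv4']['address'] }}"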
16380 1727204140.99477: handler run complete 16380 1727204140.99685: variable 'ansible_facts' from source: unknown 16380 1727204140.99835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204141.00358: variable 'ansible_facts' from source: unknown 16380 1727204141.00695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204141.00699: attempt loop complete, returning result 16380 1727204141.00702: _execute() done 16380 1727204141.00706: dumping result to json 16380 1727204141.00725: done dumping result, returning 16380 1727204141.00734: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-00000000007e] 16380 1727204141.00740: sending task result for task 12b410aa-8751-749c-b6eb-00000000007e 16380 1727204141.01082: done sending task result for task 12b410aa-8751-749c-b6eb-00000000007e 16380 1727204141.01085: WORKER PROCESS EXITING ok: [managed-node2] 16380 1727204141.01749: no more pending results, returning what we have 16380 1727204141.01753: results queue empty 16380 1727204141.01754: checking for any_errors_fatal 16380 1727204141.01756: done checking for any_errors_fatal 16380 1727204141.01757: checking for max_fail_percentage 16380 1727204141.01758: done checking for max_fail_percentage 16380 1727204141.01759: checking to see if all hosts have failed and the running result is not ok 16380 1727204141.01760: done checking to see if all hosts have failed 16380 1727204141.01762: getting the remaining hosts for this loop 16380 1727204141.01764: done getting the remaining hosts for this loop 16380 1727204141.01768: getting the next task for host managed-node2 16380 1727204141.01775: done getting next task for host managed-node2 16380 1727204141.01777: ^ task is: TASK: meta (flush_handlers) 16380 1727204141.01779: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 16380 1727204141.01784: getting variables 16380 1727204141.01786: in VariableManager get_vars() 16380 1727204141.01815: Calling all_inventory to load vars for managed-node2 16380 1727204141.01819: Calling groups_inventory to load vars for managed-node2 16380 1727204141.01823: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204141.01843: Calling all_plugins_play to load vars for managed-node2 16380 1727204141.01847: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204141.01851: Calling groups_plugins_play to load vars for managed-node2 16380 1727204141.02298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204141.02572: done with get_vars() 16380 1727204141.02585: done getting variables 16380 1727204141.02671: in VariableManager get_vars() 16380 1727204141.02683: Calling all_inventory to load vars for managed-node2 16380 1727204141.02686: Calling groups_inventory to load vars for managed-node2 16380 1727204141.02692: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204141.02698: Calling all_plugins_play to load vars for managed-node2 16380 1727204141.02709: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204141.02714: Calling groups_plugins_play to load vars for managed-node2 16380 1727204141.02912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204141.03188: done with get_vars() 16380 1727204141.03206: done queuing things up, now waiting for results queue to drain 16380 1727204141.03209: results queue empty 16380 1727204141.03210: checking for any_errors_fatal 16380 1727204141.03213: done checking for any_errors_fatal 16380 1727204141.03214: checking for max_fail_percentage 16380 1727204141.03215: done checking for max_fail_percentage 16380 1727204141.03216: checking to see if all hosts have failed and the running result is not ok 16380 1727204141.03217: done checking to see if all hosts have failed 16380 1727204141.03218: getting the remaining hosts for this loop 16380 1727204141.03224: done getting the remaining hosts for this loop 16380 1727204141.03227: getting the next task for host managed-node2 16380 1727204141.03233: done getting next task for host managed-node2 16380 1727204141.03235: ^ task is: TASK: Include the task 'el_repo_setup.yml' 16380 1727204141.03237: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204141.03240: getting variables 16380 1727204141.03241: in VariableManager get_vars() 16380 1727204141.03258: Calling all_inventory to load vars for managed-node2 16380 1727204141.03261: Calling groups_inventory to load vars for managed-node2 16380 1727204141.03265: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204141.03271: Calling all_plugins_play to load vars for managed-node2 16380 1727204141.03274: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204141.03278: Calling groups_plugins_play to load vars for managed-node2 16380 1727204141.03507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204141.03775: done with get_vars() 16380 1727204141.03785: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Tuesday 24 September 2024 14:55:41 -0400 (0:00:02.114) 0:00:02.145 ***** 16380 1727204141.03873: entering _queue_task() for managed-node2/include_tasks 16380 1727204141.03875: Creating lock for include_tasks 16380 1727204141.04278: worker is 1 (out of 1 available) 16380 1727204141.04346: exiting _queue_task() for managed-node2/include_tasks 16380 1727204141.04357: done queuing things up, now waiting for results queue to drain 16380 1727204141.04359: waiting for pending results... 16380 1727204141.04578: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 16380 1727204141.04676: in run() - task 12b410aa-8751-749c-b6eb-000000000006 16380 1727204141.04700: variable 'ansible_search_path' from source: unknown 16380 1727204141.04744: calling self._execute() 16380 1727204141.04838: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204141.04852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204141.04868: variable 'omit' from source: magic vars 16380 1727204141.05010: _execute() done 16380 1727204141.05020: dumping result to json 16380 1727204141.05030: done dumping result, returning 16380 1727204141.05094: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-749c-b6eb-000000000006] 16380 1727204141.05097: sending task result for task 12b410aa-8751-749c-b6eb-000000000006 16380 1727204141.05395: done sending task result for task 12b410aa-8751-749c-b6eb-000000000006 16380 1727204141.05398: WORKER PROCESS EXITING 16380 1727204141.05441: no more pending results, returning what we have 16380 1727204141.05446: in VariableManager get_vars() 16380 1727204141.05473: Calling all_inventory to load vars for managed-node2 16380 1727204141.05477: Calling groups_inventory to load vars for managed-node2 16380 1727204141.05480: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204141.05493: Calling all_plugins_play to load vars for managed-node2 16380 1727204141.05496: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204141.05501: Calling groups_plugins_play to load vars for managed-node2 16380 1727204141.05745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204141.06037: done with get_vars() 16380 1727204141.06045: variable 'ansible_search_path' from source: unknown 16380 1727204141.06060: we have included files to process 16380 1727204141.06062: generating 
all_blocks data 16380 1727204141.06063: done generating all_blocks data 16380 1727204141.06064: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 16380 1727204141.06066: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 16380 1727204141.06069: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 16380 1727204141.06957: in VariableManager get_vars() 16380 1727204141.06976: done with get_vars() 16380 1727204141.06992: done processing included file 16380 1727204141.06994: iterating over new_blocks loaded from include file 16380 1727204141.06996: in VariableManager get_vars() 16380 1727204141.07007: done with get_vars() 16380 1727204141.07015: filtering new block on tags 16380 1727204141.07034: done filtering new block on tags 16380 1727204141.07037: in VariableManager get_vars() 16380 1727204141.07050: done with get_vars() 16380 1727204141.07051: filtering new block on tags 16380 1727204141.07071: done filtering new block on tags 16380 1727204141.07074: in VariableManager get_vars() 16380 1727204141.07085: done with get_vars() 16380 1727204141.07086: filtering new block on tags 16380 1727204141.07104: done filtering new block on tags 16380 1727204141.07106: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 16380 1727204141.07113: extending task lists for all hosts with included blocks 16380 1727204141.07182: done extending task lists 16380 1727204141.07183: done processing included files 16380 1727204141.07184: results queue empty 16380 1727204141.07185: checking for any_errors_fatal 16380 1727204141.07187: done checking for any_errors_fatal 16380 1727204141.07188: checking for max_fail_percentage 16380 1727204141.07191: done checking for max_fail_percentage 16380 1727204141.07192: checking to see if all hosts have failed and the running result is not ok 16380 1727204141.07193: done checking to see if all hosts have failed 16380 1727204141.07194: getting the remaining hosts for this loop 16380 1727204141.07195: done getting the remaining hosts for this loop 16380 1727204141.07198: getting the next task for host managed-node2 16380 1727204141.07202: done getting next task for host managed-node2 16380 1727204141.07205: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 16380 1727204141.07208: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204141.07210: getting variables 16380 1727204141.07211: in VariableManager get_vars() 16380 1727204141.07221: Calling all_inventory to load vars for managed-node2 16380 1727204141.07223: Calling groups_inventory to load vars for managed-node2 16380 1727204141.07233: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204141.07239: Calling all_plugins_play to load vars for managed-node2 16380 1727204141.07242: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204141.07246: Calling groups_plugins_play to load vars for managed-node2 16380 1727204141.07468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204141.07753: done with get_vars() 16380 1727204141.07763: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.039) 0:00:02.185 ***** 16380 1727204141.07847: entering _queue_task() for managed-node2/setup 16380 1727204141.08236: worker is 1 (out of 1 available) 16380 1727204141.08248: exiting _queue_task() for managed-node2/setup 16380 1727204141.08258: done queuing things up, now waiting for results queue to drain 16380 1727204141.08260: waiting for pending results... 16380 1727204141.08406: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 16380 1727204141.08522: in run() - task 12b410aa-8751-749c-b6eb-00000000008f 16380 1727204141.08593: variable 'ansible_search_path' from source: unknown 16380 1727204141.08598: variable 'ansible_search_path' from source: unknown 16380 1727204141.08601: calling self._execute() 16380 1727204141.08682: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204141.08701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204141.08717: variable 'omit' from source: magic vars 16380 1727204141.09397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204141.12017: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204141.12182: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204141.12213: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204141.12272: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204141.12314: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204141.12475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204141.12479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204141.12510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 16380 1727204141.12567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204141.12595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204141.12822: variable 'ansible_facts' from source: unknown 16380 1727204141.12926: variable 'network_test_required_facts' from source: task vars 16380 1727204141.12978: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 16380 1727204141.12993: variable 'omit' from source: magic vars 16380 1727204141.13095: variable 'omit' from source: magic vars 16380 1727204141.13127: variable 'omit' from source: magic vars 16380 1727204141.13170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204141.13209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204141.13271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204141.13274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204141.13293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204141.13332: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204141.13346: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204141.13355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204141.13560: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204141.13564: Set connection var ansible_shell_executable to /bin/sh 16380 1727204141.13566: Set connection var ansible_connection to ssh 16380 1727204141.13568: Set connection var ansible_shell_type to sh 16380 1727204141.13571: Set connection var ansible_pipelining to False 16380 1727204141.13573: Set connection var ansible_timeout to 10 16380 1727204141.13601: variable 'ansible_shell_executable' from source: unknown 16380 1727204141.13611: variable 'ansible_connection' from source: unknown 16380 1727204141.13619: variable 'ansible_module_compression' from source: unknown 16380 1727204141.13626: variable 'ansible_shell_type' from source: unknown 16380 1727204141.13633: variable 'ansible_shell_executable' from source: unknown 16380 1727204141.13641: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204141.13649: variable 'ansible_pipelining' from source: unknown 16380 1727204141.13656: variable 'ansible_timeout' from source: unknown 16380 1727204141.13670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204141.13853: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204141.13887: variable 'omit' from source: magic vars 16380 1727204141.13890: starting attempt loop 16380 
1727204141.13895: running the handler 16380 1727204141.13926: _low_level_execute_command(): starting 16380 1727204141.13998: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204141.14662: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204141.14706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204141.14723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204141.14747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204141.14811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204141.14905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204141.14918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204141.15217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16380 1727204141.17714: stdout chunk (state=3): >>>/root <<< 16380 1727204141.17833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204141.17908: stderr chunk (state=3): >>><<< 16380 1727204141.17937: stdout chunk (state=3): >>><<< 16380 1727204141.18059: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16380 1727204141.18092: _low_level_execute_command(): starting 16380 1727204141.18107: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626 `" && echo 
ansible-tmp-1727204141.1807482-16538-86196282936626="` echo /root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626 `" ) && sleep 0' 16380 1727204141.19566: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204141.19715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204141.19733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204141.19805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16380 1727204141.22947: stdout chunk (state=3): >>>ansible-tmp-1727204141.1807482-16538-86196282936626=/root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626 <<< 16380 1727204141.22951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204141.22953: stdout chunk (state=3): >>><<< 16380 1727204141.22956: stderr chunk (state=3): >>><<< 16380 1727204141.23164: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204141.1807482-16538-86196282936626=/root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16380 1727204141.23582: variable 'ansible_module_compression' from source: unknown 16380 1727204141.23610: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204141.23681: variable 'ansible_facts' from source: unknown 16380 1727204141.24174: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/AnsiballZ_setup.py 16380 1727204141.24528: Sending initial data 16380 1727204141.24568: Sent initial data (153 bytes) 16380 1727204141.25735: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204141.25770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204141.26074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204141.26114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204141.27870: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 16380 1727204141.27997: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204141.28021: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204141.28056: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpuw628dw6 /root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/AnsiballZ_setup.py <<< 16380 1727204141.28311: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/AnsiballZ_setup.py" <<< 16380 1727204141.28315: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpuw628dw6" to remote "/root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/AnsiballZ_setup.py" <<< 16380 1727204141.32955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204141.32997: stderr chunk (state=3): >>><<< 16380 1727204141.33096: stdout chunk (state=3): >>><<< 16380 1727204141.33099: done transferring module to remote 16380 1727204141.33216: _low_level_execute_command(): starting 16380 1727204141.33221: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/ /root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/AnsiballZ_setup.py && sleep 0' 16380 1727204141.34486: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204141.34504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204141.34525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204141.34549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204141.34637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204141.34640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204141.34712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204141.34770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204141.34805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204141.36956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204141.36960: stdout chunk (state=3): >>><<< 16380 1727204141.36963: stderr chunk (state=3): >>><<< 16380 1727204141.36966: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204141.36968: _low_level_execute_command(): starting 16380 1727204141.36971: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/AnsiballZ_setup.py && sleep 0' 16380 1727204141.38911: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204141.39134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204141.39214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204141.39608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204141.41502: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 16380 1727204141.41528: stdout chunk (state=3): >>>import _imp # builtin <<< 16380 1727204141.41568: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 16380 1727204141.41585: stdout chunk (state=3): >>>import '_weakref' # <<< 16380 1727204141.41636: stdout chunk (state=3): >>>import '_io' # <<< 16380 1727204141.41711: stdout chunk (state=3): >>>import 'marshal' # <<< 16380 1727204141.41744: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # <<< 16380 1727204141.41771: stdout chunk (state=3): >>># installed zipimport hook <<< 16380 1727204141.41804: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 16380 1727204141.41884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 16380 
1727204141.41924: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4940c4d0> <<< 16380 1727204141.41966: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d493dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4940ea20> <<< 16380 1727204141.42087: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 16380 1727204141.42117: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 16380 1727204141.42169: stdout chunk (state=3): >>>import '_collections_abc' # <<< 16380 1727204141.42228: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 16380 1727204141.42433: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages <<< 16380 1727204141.42453: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492210a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49221fd0> <<< 16380 1727204141.42477: stdout chunk (state=3): >>>import 'site' # <<< 16380 1727204141.42500: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 16380 1727204141.42897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 16380 1727204141.42927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 16380 1727204141.42958: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204141.42981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 16380 1727204141.43028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 16380 1727204141.43043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 16380 1727204141.43104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 16380 1727204141.43149: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925ff80> <<< 16380 1727204141.43152: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 16380 1727204141.43216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 16380 1727204141.43220: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 16380 1727204141.43263: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204141.43413: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492978c0> <<< 16380 1727204141.43417: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49297f50> <<< 16380 1727204141.43451: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49277b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492752b0> <<< 16380 1727204141.43569: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925d070> <<< 16380 1727204141.43656: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 16380 1727204141.43683: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 16380 1727204141.43778: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 16380 1727204141.43782: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492bb890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ba4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492762a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492b8bc0> <<< 16380 1727204141.43836: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ec800> <<< 16380 1727204141.44023: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d492eccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ecb60> <<< 16380 1727204141.44031: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d492ecf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 16380 1727204141.44065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ed610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ed2e0> import 'importlib.machinery' # <<< 16380 1727204141.44185: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 16380 1727204141.44191: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ee510> import 'importlib.util' # <<< 16380 1727204141.44231: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49308740> <<< 16380 1727204141.44397: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d49309e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4930ad80> <<< 16380 1727204141.44418: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4930b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4930a2d0> <<< 16380 1727204141.44442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 16380 1727204141.44487: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4930be30> <<< 16380 1727204141.44771: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4930b560> <<< 16380 1727204141.44776: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ee570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4905fd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d490887d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49088530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d49088800> <<< 16380 1727204141.44813: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d490889e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4905dee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 16380 1727204141.44937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 16380 1727204141.44970: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4908a000> <<< 16380 1727204141.45000: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49088c80> <<< 16380 1727204141.45045: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492eec60> <<< 16380 1727204141.45048: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 16380 1727204141.45117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204141.45120: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 16380 1727204141.45213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 16380 1727204141.45216: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d490b6390> <<< 16380 1727204141.45255: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 16380 1727204141.45266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204141.45347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py <<< 16380 1727204141.45365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 16380 1727204141.45368: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d490ce540> <<< 16380 1727204141.45380: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 16380 1727204141.45437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 16380 1727204141.45975: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d491072f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4912da90> <<< 16380 1727204141.45979: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49107410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d490cf1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f20440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d490cd580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4908af30> <<< 16380 1727204141.46059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 16380 1727204141.46081: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7d48f206e0> <<< 16380 1727204141.46337: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_uauoyg28/ansible_setup_payload.zip' # zipimport: zlib available <<< 16380 1727204141.46418: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.46454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 16380 1727204141.46504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 16380 1727204141.46582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 16380 1727204141.46624: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f8e120> <<< 16380 1727204141.46649: stdout chunk (state=3): >>>import '_typing' # <<< 16380 1727204141.46925: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f650a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f64200> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 16380 1727204141.46939: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.48502: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.49805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f67e60> <<< 16380 1727204141.49871: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 16380 1727204141.49914: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 16380 1727204141.50098: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48fbdb80> <<< 16380 1727204141.50138: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48fbd910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48fbd220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48fbdc70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f8ee40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48fbe930> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48fbeb70> <<< 16380 1727204141.50150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 16380 
1727204141.50207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 16380 1727204141.50259: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48fbef90> <<< 16380 1727204141.50281: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 16380 1727204141.50324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 16380 1727204141.50352: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e24da0> <<< 16380 1727204141.50427: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204141.50430: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e269c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 16380 1727204141.50550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 16380 1727204141.50616: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e272f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 16380 1727204141.50619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e284d0> <<< 16380 1727204141.50621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 16380 1727204141.50624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 16380 1727204141.50626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 16380 1727204141.50726: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2af90> <<< 16380 1727204141.50729: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e2b2f0> <<< 16380 1727204141.50786: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e29250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 16380 1727204141.50794: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 16380 1727204141.50941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from 
'/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 16380 1727204141.50944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2ef90> import '_tokenize' # <<< 16380 1727204141.51112: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2da60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2d7c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2fb30> <<< 16380 1727204141.51181: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e296d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e73110> <<< 16380 1727204141.51307: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e73320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 16380 1727204141.51434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e78e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e78bc0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 16380 1727204141.51493: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e7b350> <<< 16380 1727204141.51536: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7d48e794f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 16380 1727204141.51569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204141.51928: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 16380 1727204141.51932: stdout chunk (state=3): >>>import '_string' # <<< 16380 1727204141.51947: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e82b70> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e7b500> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e83920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e83c20> <<< 16380 1727204141.51986: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204141.52014: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e83c80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e73500> <<< 16380 1727204141.52049: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 16380 1727204141.52160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e86ab0> <<< 16380 1727204141.52342: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e880e0> <<< 16380 1727204141.52377: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e85220> <<< 16380 1727204141.52405: stdout 
chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e86600> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e84e00> # zipimport: zlib available <<< 16380 1727204141.52444: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 16380 1727204141.52456: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.52585: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.52664: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.52729: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 16380 1727204141.52906: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.53112: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.53706: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.54400: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 16380 1727204141.54420: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 16380 1727204141.54445: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 16380 1727204141.54464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204141.54646: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48d103b0> <<< 16380 1727204141.54650: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 16380 1727204141.54653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d111f0> <<< 16380 1727204141.54742: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e8ac00> <<< 16380 1727204141.54766: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 16380 1727204141.54771: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.55094: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 16380 1727204141.55130: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.55160: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from 
'/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d10f50> <<< 16380 1727204141.55179: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.55883: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.56284: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.56367: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.56463: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 16380 1727204141.56478: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.56516: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.56558: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 16380 1727204141.56570: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.56659: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.56784: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 16380 1727204141.56813: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 16380 1727204141.56854: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.56912: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 16380 1727204141.56915: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.57187: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.57467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 16380 1727204141.57543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 16380 1727204141.57561: stdout chunk (state=3): >>>import '_ast' # <<< 16380 1727204141.57661: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d13530> <<< 16380 1727204141.57752: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.57812: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 16380 1727204141.57861: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 16380 1727204141.57893: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 16380 1727204141.58039: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204141.58335: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48d19cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f7d48d1a600> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e8b4d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 16380 1727204141.58352: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.58465: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.58526: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.58771: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204141.58843: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204141.58913: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48d19400> <<< 16380 1727204141.58954: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d1a780> <<< 16380 1727204141.59061: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 16380 1727204141.59090: stdout chunk (state=3): >>> import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 16380 1727204141.59170: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.59262: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.59306: stdout chunk (state=3): >>> <<< 16380 1727204141.59437: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 16380 1727204141.59478: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 16380 1727204141.59516: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 16380 1727204141.59644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 16380 1727204141.59773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48dae990> <<< 16380 1727204141.59860: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d245c0> <<< 16380 1727204141.60002: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d22750> <<< 16380 1727204141.60054: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d225a0> # destroy 
ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 16380 1727204141.60058: stdout chunk (state=3): >>> # zipimport: zlib available <<< 16380 1727204141.60136: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.60175: stdout chunk (state=3): >>> import 'ansible.module_utils.common._utils' # <<< 16380 1727204141.60179: stdout chunk (state=3): >>> import 'ansible.module_utils.common.sys_info' # <<< 16380 1727204141.60348: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 16380 1727204141.60351: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 16380 1727204141.60450: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.60562: stdout chunk (state=3): >>> # zipimport: zlib available<<< 16380 1727204141.60990: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.60994: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 16380 1727204141.61018: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.61111: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.61148: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.61206: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 16380 1727204141.61859: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.62261: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48db5280> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 16380 1727204141.62694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4830c0e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' 
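
The dense runs of "# zipimport: zlib available" above are the wrapped module reading its own payload: AnsiballZ ships ansible.module_utils inside a zip archive delivered with the wrapper, places that archive on sys.path, and lets zipimport satisfy the imports, so every import probe under PYTHONVERBOSE=1 logs a zipimport check. Below is a toy reproduction of that mechanism; it is a minimal sketch, and toy_module_utils and payload.zip are invented names, not anything Ansible actually creates.

import io
import os
import sys
import tempfile
import zipfile

# Bundle a small package into a zip, loosely the way AnsiballZ bundles
# module_utils alongside the module source.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("toy_module_utils/__init__.py", "GREETING = 'served from the zip'\n")

payload = os.path.join(tempfile.mkdtemp(), "payload.zip")
with open(payload, "wb") as f:
    f.write(buf.getvalue())

# With the archive on sys.path, the import is answered by zipimport --
# the same machinery behind the "# zipimport" chatter in this log.
sys.path.insert(0, payload)
import toy_module_utils

print(toy_module_utils.GREETING)                   # served from the zip
print(type(toy_module_utils.__loader__).__name__)  # zipimporter
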
import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4830c410> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d89130> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d88440> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48db7110> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48db6d20> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 16380 1727204141.63059: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4830f4d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4830ed80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4830ef60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4830e1b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4830f5c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4836a0f0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48368110> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48db4a10> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.63079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.63140: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 16380 1727204141.63200: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.63263: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 16380 1727204141.63311: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.63332: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.63388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 16380 1727204141.63457: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.63543: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.63588: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.63672: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 16380 1727204141.63676: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.64264: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.64736: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 16380 1727204141.64805: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.64853: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.64884: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.64954: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 16380 1727204141.64959: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.64983: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.65233: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 16380 1727204141.65392: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.65396: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 16380 1727204141.65514: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.65564: stdout chunk (state=3): >>> <<< 16380 1727204141.65713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 16380 1727204141.65755: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4836bf50><<< 16380 1727204141.65821: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 16380 1727204141.65858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc'<<< 16380 1727204141.66124: stdout chunk (state=3): >>> import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d4836aed0><<< 16380 1727204141.66153: stdout chunk (state=3): >>> <<< 16380 1727204141.66341: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.66402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 16380 1727204141.66423: stdout chunk (state=3): >>> <<< 16380 1727204141.66449: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.66689: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.66794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 16380 1727204141.66800: stdout chunk (state=3): >>> <<< 16380 1727204141.66828: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.66835: stdout chunk (state=3): >>> <<< 16380 1727204141.67020: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.67026: stdout chunk (state=3): >>> <<< 16380 1727204141.67088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 16380 1727204141.67101: stdout chunk (state=3): >>> <<< 16380 1727204141.67305: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 16380 1727204141.67394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc'<<< 16380 1727204141.67404: stdout chunk (state=3): >>> <<< 16380 1727204141.67583: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 16380 1727204141.67641: stdout chunk (state=3): >>> # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 16380 1727204141.67645: stdout chunk (state=3): >>> <<< 16380 1727204141.67648: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d483aa480> <<< 16380 1727204141.68013: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d483922a0> import 'ansible.module_utils.facts.system.python' # <<< 16380 1727204141.68034: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.68074: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.68129: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 16380 1727204141.68215: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.68315: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.68328: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.68447: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.68907: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4819df40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4819de80> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 16380 1727204141.68971: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 16380 1727204141.68988: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.69031: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 16380 1727204141.69047: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.69218: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.69395: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 16380 1727204141.69422: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.69517: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.69624: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.69665: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.69768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 16380 1727204141.69782: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.69991: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.70474: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.70558: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 16380 1727204141.70618: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.70683: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.70697: stdout chunk (state=3): >>> <<< 16380 1727204141.71768: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.72807: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 16380 1727204141.72820: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 16380 1727204141.72891: stdout chunk (state=3): >>> <<< 16380 1727204141.72912: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.73167: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.73271: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 16380 1727204141.73295: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.73471: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.73493: stdout chunk (state=3): >>> <<< 16380 1727204141.73675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 16380 1727204141.73702: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.73721: stdout chunk (state=3): >>> <<< 16380 1727204141.74001: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 
1727204141.74026: stdout chunk (state=3): >>> <<< 16380 1727204141.74299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 16380 1727204141.74332: stdout chunk (state=3): >>> # zipimport: zlib available<<< 16380 1727204141.74446: stdout chunk (state=3): >>> # zipimport: zlib available <<< 16380 1727204141.74450: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 16380 1727204141.74551: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 16380 1727204141.74585: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.74602: stdout chunk (state=3): >>> <<< 16380 1727204141.74777: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.74955: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.74980: stdout chunk (state=3): >>> <<< 16380 1727204141.75567: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.75768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 16380 1727204141.75869: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 16380 1727204141.75876: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.76002: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.76036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 16380 1727204141.76051: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.76115: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 16380 1727204141.76217: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.76248: stdout chunk (state=3): >>> <<< 16380 1727204141.76270: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.76386: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 16380 1727204141.76412: stdout chunk (state=3): >>> <<< 16380 1727204141.76588: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 16380 1727204141.76829: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.76832: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.76917: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 16380 1727204141.76955: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.77487: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.77504: stdout chunk (state=3): >>> <<< 16380 1727204141.77984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 16380 1727204141.78016: stdout chunk (state=3): >>> # zipimport: zlib available<<< 16380 1727204141.78067: stdout chunk (state=3): >>> <<< 16380 1727204141.78140: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.78163: stdout chunk (state=3): >>> <<< 16380 1727204141.78364: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 16380 1727204141.78398: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16380 1727204141.78447: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 16380 1727204141.78501: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.78558: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.netbsd' # <<< 16380 1727204141.78611: stdout chunk (state=3): >>> # zipimport: zlib available <<< 16380 1727204141.78653: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.78717: stdout chunk (state=3): >>> <<< 16380 1727204141.78765: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available<<< 16380 1727204141.78768: stdout chunk (state=3): >>> <<< 16380 1727204141.78917: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.79083: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available<<< 16380 1727204141.79089: stdout chunk (state=3): >>> <<< 16380 1727204141.79141: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 16380 1727204141.79280: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # <<< 16380 1727204141.79316: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.79362: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.79405: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.79492: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.79567: stdout chunk (state=3): >>> <<< 16380 1727204141.79606: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.79738: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.79887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 16380 1727204141.79913: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 16380 1727204141.79953: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 16380 1727204141.80122: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # <<< 16380 1727204141.80145: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.80788: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.80958: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 16380 1727204141.80999: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.81069: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204141.81132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 16380 1727204141.81167: stdout chunk (state=3): >>> # zipimport: zlib available <<< 16380 1727204141.81248: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.81428: stdout chunk (state=3): >>> <<< 16380 1727204141.81432: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available<<< 16380 1727204141.81469: stdout chunk (state=3): >>> <<< 16380 1727204141.81514: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.81551: stdout chunk (state=3): >>> <<< 16380 1727204141.81734: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # 
zipimport: zlib available <<< 16380 1727204141.81887: stdout chunk (state=3): >>># zipimport: zlib available<<< 16380 1727204141.81937: stdout chunk (state=3): >>> <<< 16380 1727204141.82087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # <<< 16380 1727204141.82241: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 16380 1727204141.83139: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py<<< 16380 1727204141.83171: stdout chunk (state=3): >>> <<< 16380 1727204141.83187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc'<<< 16380 1727204141.83236: stdout chunk (state=3): >>> <<< 16380 1727204141.83257: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 16380 1727204141.83280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 16380 1727204141.83341: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 16380 1727204141.83360: stdout chunk (state=3): >>> # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 16380 1727204141.83396: stdout chunk (state=3): >>> import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d481c7770> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d481c5f10> <<< 16380 1727204141.83502: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d481c7410><<< 16380 1727204141.83585: stdout chunk (state=3): >>> <<< 16380 1727204141.84835: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "41", "epoch": "1727204141", "epoch_int": "1727204141", "date": "2024-09-24", "time": "14:55:41", "iso8601_micro": 
"2024-09-24T18:55:41.830281Z", "iso8601": "2024-09-24T18:55:41Z", "iso8601_basic": "20240924T145541830281", "iso8601_basic_short": "20240924T145541", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": 
["us-east-1.aws.redhat.com"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}}<<< 16380 1727204141.84892: stdout chunk (state=3): >>> <<< 16380 1727204141.85561: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 16380 1727204141.85599: stdout chunk (state=3): >>># clear sys.path_hooks<<< 16380 1727204141.85628: stdout chunk (state=3): >>> # clear builtins._ <<< 16380 1727204141.85728: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value<<< 16380 1727204141.85920: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin<<< 16380 1727204141.85924: stdout chunk (state=3): >>> # restore sys.stdout<<< 16380 1727204141.85937: stdout chunk (state=3): >>> # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport<<< 16380 1727204141.86034: stdout chunk (state=3): >>> # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util 
# cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder <<< 16380 1727204141.86102: stdout chunk (state=3): >>># cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess <<< 16380 1727204141.86495: stdout chunk (state=3): >>># cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes 
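
The "# clear ...", "# cleanup[2] removing ..." and "# destroy ..." runs here are not Ansible output at all: they are CPython's interpreter-shutdown trace, printed because the module ran with PYTHONVERBOSE=1 (visible in ansible_env earlier). Any interpreter emits the same chatter on stderr, as the short sketch below demonstrates; the filtering is only for display.

import subprocess
import sys

# Run a no-op script verbosely and keep a few of the shutdown lines.
proc = subprocess.run(
    [sys.executable, "-v", "-c", "pass"],
    capture_output=True,
    text=True,
)
shutdown = [line for line in proc.stderr.splitlines()
            if line.startswith(("# cleanup", "# destroy"))]
print("\n".join(shutdown[:5]))
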
# cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.pro<<< 16380 1727204141.86529: stdout chunk (state=3): >>>cess # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing 
ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy 
ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 16380 1727204141.86852: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 16380 1727204141.86880: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 16380 1727204141.86913: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 16380 1727204141.86950: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 16380 1727204141.86983: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 16380 1727204141.87002: stdout chunk (state=3): >>># destroy ipaddress <<< 16380 1727204141.87061: stdout chunk (state=3): >>># destroy ntpath <<< 16380 1727204141.87064: stdout chunk (state=3): >>># destroy importlib <<< 16380 1727204141.87067: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal <<< 16380 1727204141.87107: stdout chunk (state=3): >>># destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 16380 1727204141.87111: stdout chunk (state=3): >>># destroy _json <<< 16380 1727204141.87134: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale <<< 16380 1727204141.87163: stdout chunk (state=3): >>># destroy locale <<< 16380 1727204141.87166: stdout chunk (state=3): >>># destroy select # destroy _signal <<< 16380 1727204141.87270: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 16380 1727204141.87338: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing <<< 16380 1727204141.87342: stdout chunk (state=3): >>># destroy multiprocessing.connection<<< 16380 1727204141.87344: stdout chunk (state=3): >>> # destroy multiprocessing.pool # destroy signal <<< 16380 1727204141.87383: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 16380 1727204141.87388: stdout chunk (state=3): >>># destroy _pickle <<< 16380 1727204141.87404: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue <<< 16380 1727204141.87422: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata <<< 16380 1727204141.87439: stdout chunk (state=3): >>># destroy tempfile <<< 16380 1727204141.87464: stdout chunk (state=3): >>># destroy multiprocessing.util # destroy multiprocessing.reduction <<< 16380 1727204141.87583: stdout chunk (state=3): >>># destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy 
base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 16380 1727204141.87586: stdout chunk (state=3): >>># destroy errno # destroy json <<< 16380 1727204141.87628: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 16380 1727204141.87634: stdout chunk (state=3): >>># destroy glob <<< 16380 1727204141.87655: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 16380 1727204141.87725: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 16380 1727204141.87750: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 16380 1727204141.87761: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 16380 1727204141.87786: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache <<< 16380 1727204141.87815: stdout chunk (state=3): >>># destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib <<< 16380 1727204141.87838: stdout chunk (state=3): >>># cleanup[3] wiping threading <<< 16380 1727204141.87887: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 16380 1727204141.87903: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools <<< 16380 1727204141.87937: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 16380 1727204141.88183: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # 
destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 16380 1727204141.88516: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 16380 1727204141.88678: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 16380 1727204141.88682: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io<<< 16380 1727204141.88685: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 16380 1727204141.88687: stdout chunk (state=3): >>># destroy atexit <<< 16380 1727204141.88694: stdout chunk (state=3): >>># destroy _warnings <<< 16380 1727204141.88697: stdout chunk (state=3): >>># destroy math # destroy _bisect <<< 16380 1727204141.88720: stdout chunk (state=3): >>># destroy time <<< 16380 1727204141.88736: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 16380 1727204141.88766: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string <<< 16380 1727204141.88847: stdout chunk (state=3): >>># destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 16380 1727204141.88966: stdout chunk (state=3): >>># clear sys.audit hooks <<< 16380 1727204141.89685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
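With the remote interpreter fully torn down, the SSH master reports exit status 0 and the shared connection closes; the next line is the controller summarizing the whole remote command as rc=0 plus the captured stdout. As a rough illustration only (a hypothetical helper, not Ansible's ssh connection plugin, which adds ControlPersist, pipelining, and the chunked reads visible as "stdout chunk (state=3)" above), the shape of such a low-level execute step is:

    import subprocess

    def low_level_execute(host: str, cmd: str) -> tuple[int, str, str]:
        # Run cmd on the remote host via the system ssh client and collect
        # rc/stdout/stderr, like the `rc=0, stdout=...` summary in this log.
        proc = subprocess.run(["ssh", host, cmd], capture_output=True, text=True)
        return proc.returncode, proc.stdout, proc.stderr

    rc, out, err = low_level_execute("10.31.9.159", "/usr/bin/python3 -V")
    print(f"rc={rc} stdout={out.strip()!r}")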
<<< 16380 1727204141.89688: stdout chunk (state=3): >>><<< 16380 1727204141.89694: stderr chunk (state=3): >>><<< 16380 1727204141.89744: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4940c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d493dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4940ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492210a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49221fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
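The captured stdout replays the remote interpreter from the very start: the frozen importlib bootstrap, installation of the zipimport hook, the encodings package, and site initialization (user and global site-packages plus .pth processing), ending at the familiar startup banner. A small sketch of how to inspect the machinery this startup trace is describing (standard library only; the exact list contents vary by interpreter build):

    import sys
    import zipimport

    # After "installing zipimport hook", zipimporter sits on sys.path_hooks and
    # the frozen importlib finders sit on sys.meta_path.
    print(sys.meta_path)    # typically BuiltinImporter, FrozenImporter, PathFinder
    print(sys.path_hooks)   # zipimporter first, then the FileFinder factory
    print(zipimport.zipimporter)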
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492978c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49297f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49277b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492752b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925d070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492bb890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ba4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492762a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492b8bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ec800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d492eccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ecb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d492ecf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4925ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ed610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ed2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ee510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49308740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d49309e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7d4930ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4930b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4930a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4930be30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4930b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492ee570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4905fd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d490887d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49088530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d49088800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d490889e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4905dee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4908a000> 
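Two loader patterns repeat throughout this trace: "__pycache__/X.cpython-312.pyc matches X.py" is SourceFileLoader validating cached bytecode before it would recompile, and "extension module '_bz2' loaded from ... executed from ..." is ExtensionFileLoader's two-phase init of a compiled .so. The cache-path mapping the first pattern relies on is exposed directly by importlib (the cpython-312 tag comes from sys.implementation.cache_tag of the interpreter in this log):

    import importlib.util

    # Map a source file to the cached bytecode path the loader checks.
    pyc = importlib.util.cache_from_source("/usr/lib64/python3.12/base64.py")
    print(pyc)  # .../__pycache__/base64.cpython-312.pyc on a CPython 3.12 build

    # And back again, from cache file to source file.
    print(importlib.util.source_from_cache(pyc))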
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49088c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d492eec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d490b6390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d490ce540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d491072f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4912da90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d49107410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d490cf1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f20440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d490cd580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4908af30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7d48f206e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_uauoyg28/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f8e120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f650a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f64200> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f67e60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48fbdb80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48fbd910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48fbd220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48fbdc70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48f8ee40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48fbe930> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48fbeb70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48fbef90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e24da0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e269c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e272f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e284d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2af90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e2b2f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e29250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2ef90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2da60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2d7c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e2fb30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e296d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e73110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e73320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e78e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e78bc0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e7b350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e794f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e82b70> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e7b500> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e83920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e83c20> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e83c80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e73500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e86ab0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e880e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e85220> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48e86600> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e84e00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48d103b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d111f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e8ac00> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
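A few lines below, the trace pulls in multiprocessing.pool and then ansible.module_utils.facts.timeout and facts.collector: fact gathering runs individual collectors under a deadline so one stuck probe cannot hang the whole setup module. A hedged sketch of that pattern (simplified, not Ansible's actual code; the collector function here is invented):

    from multiprocessing.pool import ThreadPool
    import multiprocessing
    import time

    def slow_collector():
        # Stand-in for a fact collector that hangs (e.g. a stuck mount probe).
        time.sleep(5)
        return {"fact": "value"}

    pool = ThreadPool(processes=1)
    async_result = pool.apply_async(slow_collector)
    try:
        facts = async_result.get(timeout=1)  # raises multiprocessing.TimeoutError
    except multiprocessing.TimeoutError:
        facts = {}  # give up on this collector, keep gathering the rest
    pool.terminate()
    print(facts)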
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d10f50> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d13530> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48d19cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48d1a600> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48e8b4d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d48d19400> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d1a780> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48dae990> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d245c0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d22750> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d225a0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48db5280> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d4830c0e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4830c410> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d89130> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48d88440> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48db7110> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48db6d20> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4830f4d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4830ed80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4830ef60> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4830e1b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4830f5c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4836a0f0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48368110> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d48db4a10> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4836bf50> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4836aed0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d483aa480> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d483922a0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d4819df40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d4819de80> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d481c7770> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d481c5f10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d481c7410> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "41", "epoch": "1727204141", "epoch_int": "1727204141", "date": "2024-09-24", "time": "14:55:41", "iso8601_micro": "2024-09-24T18:55:41.830281Z", "iso8601": "2024-09-24T18:55:41Z", "iso8601_basic": "20240924T145541830281", "iso8601_basic_short": "20240924T145541", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing 
_functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common 
# destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] 
removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # 
cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy 
ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # 
cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data:
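The junk the warning refers to is the interpreter's PYTHONVERBOSE shutdown trace shown above (the "# clear sys.path_importer_cache ... # clear sys.audit hooks" run), which the setup module printed to stdout after its JSON result; the warning quotes that same trace verbatim. The run evidently still recovers the result, since fact handling continues below with "handler run complete". A minimal sketch of the recovery idea in Python follows; extract_json is a hypothetical helper, not Ansible's actual implementation:

import json

def extract_json(output: str):
    # Hypothetical helper (not Ansible's code): skip leading non-JSON
    # lines such as the import traces above, then let raw_decode find
    # where the JSON document ends. Anything left over is the "junk"
    # the warning complains about. Raises json.JSONDecodeError if the
    # output contains no JSON document at all.
    idx = 0
    for line in output.splitlines(keepends=True):
        if line.lstrip().startswith(("{", "[")):
            break
        idx += len(line)
    payload = output[idx:].lstrip()
    obj, end = json.JSONDecoder().raw_decode(payload)
    return obj, payload[end:].strip()

Feeding this run's module stdout to extract_json would return the {"ansible_facts": ...} document plus the trailing "# clear ... # destroy ..." trace as junk, at which point a caller could emit the same style of warning.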
wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 16380 1727204141.91928: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/', 
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204141.91932: _low_level_execute_command(): starting 16380 1727204141.91935: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204141.1807482-16538-86196282936626/ > /dev/null 2>&1 && sleep 0' 16380 1727204141.91937: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204141.91955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204141.91973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204141.91996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204141.92044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204141.92148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204141.92169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204141.92232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204141.95203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204141.95216: stdout chunk (state=3): >>><<< 16380 1727204141.95230: stderr chunk (state=3): >>><<< 16380 1727204141.95254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204141.95268: handler run complete 16380 1727204141.95339: variable 'ansible_facts' from source: unknown 16380 1727204141.95417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 16380 1727204141.95597: variable 'ansible_facts' from source: unknown 16380 1727204141.95809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204141.95812: attempt loop complete, returning result 16380 1727204141.95815: _execute() done 16380 1727204141.95817: dumping result to json 16380 1727204141.95819: done dumping result, returning 16380 1727204141.95822: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-749c-b6eb-00000000008f] 16380 1727204141.95839: sending task result for task 12b410aa-8751-749c-b6eb-00000000008f 16380 1727204141.96148: done sending task result for task 12b410aa-8751-749c-b6eb-00000000008f 16380 1727204141.96151: WORKER PROCESS EXITING ok: [managed-node2] 16380 1727204141.96573: no more pending results, returning what we have 16380 1727204141.96576: results queue empty 16380 1727204141.96577: checking for any_errors_fatal 16380 1727204141.96579: done checking for any_errors_fatal 16380 1727204141.96580: checking for max_fail_percentage 16380 1727204141.96582: done checking for max_fail_percentage 16380 1727204141.96583: checking to see if all hosts have failed and the running result is not ok 16380 1727204141.96584: done checking to see if all hosts have failed 16380 1727204141.96585: getting the remaining hosts for this loop 16380 1727204141.96587: done getting the remaining hosts for this loop 16380 1727204141.96616: getting the next task for host managed-node2 16380 1727204141.96626: done getting next task for host managed-node2 16380 1727204141.96629: ^ task is: TASK: Check if system is ostree 16380 1727204141.96632: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204141.96636: getting variables 16380 1727204141.96637: in VariableManager get_vars() 16380 1727204141.96666: Calling all_inventory to load vars for managed-node2 16380 1727204141.96669: Calling groups_inventory to load vars for managed-node2 16380 1727204141.96673: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204141.96685: Calling all_plugins_play to load vars for managed-node2 16380 1727204141.96688: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204141.96695: Calling groups_plugins_play to load vars for managed-node2 16380 1727204141.96986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204141.97273: done with get_vars() 16380 1727204141.97294: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.895) 0:00:03.080 ***** 16380 1727204141.97422: entering _queue_task() for managed-node2/stat 16380 1727204141.97758: worker is 1 (out of 1 available) 16380 1727204141.97771: exiting _queue_task() for managed-node2/stat 16380 1727204141.97784: done queuing things up, now waiting for results queue to drain 16380 1727204141.97786: waiting for pending results... 16380 1727204141.98056: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 16380 1727204141.98333: in run() - task 12b410aa-8751-749c-b6eb-000000000091 16380 1727204141.98440: variable 'ansible_search_path' from source: unknown 16380 1727204141.98444: variable 'ansible_search_path' from source: unknown 16380 1727204141.98446: calling self._execute() 16380 1727204141.98484: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204141.98498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204141.98513: variable 'omit' from source: magic vars 16380 1727204141.99316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204141.99681: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204141.99740: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204141.99791: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204141.99856: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204141.99961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204142.00004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204142.00040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204142.00081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
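A note on the conditional the next record evaluates: "Evaluated conditional (not __network_is_ostree is defined): True" means the fact __network_is_ostree has not been set for this host yet, so the "Check if system is ostree" stat task actually runs. A minimal, simplified sketch of how such a Jinja2 "defined" test behaves — this is plain Jinja2, not Ansible's actual Templar code path; the variable name is the one from the log:

    # Simplified illustration only: Ansible evaluates task conditionals through
    # its own Templar, but the underlying test is Jinja2's built-in "defined".
    import jinja2

    env = jinja2.Environment()
    # undefined_to_none=False keeps missing variables as Undefined objects
    # so the "is defined" test can inspect them instead of seeing None.
    expr = env.compile_expression(
        "not __network_is_ostree is defined", undefined_to_none=False
    )
    print(expr())  # True: the variable is absent, so the check task proceeds

Presumably a later task sets __network_is_ostree from the stat result, after which this same conditional evaluates False and the check is skipped on subsequent passes.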
16380 1727204142.00226: Evaluated conditional (not __network_is_ostree is defined): True 16380 1727204142.00238: variable 'omit' from source: magic vars 16380 1727204142.00288: variable 'omit' from source: magic vars 16380 1727204142.00345: variable 'omit' from source: magic vars 16380 1727204142.00407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204142.00422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204142.00448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204142.00474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204142.00492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204142.00583: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204142.00587: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204142.00593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204142.00732: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204142.00735: Set connection var ansible_shell_executable to /bin/sh 16380 1727204142.00738: Set connection var ansible_connection to ssh 16380 1727204142.00741: Set connection var ansible_shell_type to sh 16380 1727204142.00754: Set connection var ansible_pipelining to False 16380 1727204142.00769: Set connection var ansible_timeout to 10 16380 1727204142.00840: variable 'ansible_shell_executable' from source: unknown 16380 1727204142.00844: variable 'ansible_connection' from source: unknown 16380 1727204142.00847: variable 'ansible_module_compression' from source: unknown 16380 1727204142.00849: variable 'ansible_shell_type' from source: unknown 16380 1727204142.00851: variable 'ansible_shell_executable' from source: unknown 16380 1727204142.00854: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204142.00856: variable 'ansible_pipelining' from source: unknown 16380 1727204142.00870: variable 'ansible_timeout' from source: unknown 16380 1727204142.00949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204142.01079: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204142.01099: variable 'omit' from source: magic vars 16380 1727204142.01110: starting attempt loop 16380 1727204142.01117: running the handler 16380 1727204142.01138: _low_level_execute_command(): starting 16380 1727204142.01152: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204142.01888: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204142.01905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204142.01924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204142.02050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204142.02065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204142.02080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204142.02165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 16380 1727204142.04357: stdout chunk (state=3): >>>/root <<< 16380 1727204142.04367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204142.04545: stderr chunk (state=3): >>><<< 16380 1727204142.04570: stdout chunk (state=3): >>><<< 16380 1727204142.04600: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 16380 1727204142.04726: _low_level_execute_command(): starting 16380 1727204142.04731: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087 `" && echo ansible-tmp-1727204142.0461764-16575-201094406905087="` echo /root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087 `" ) && sleep 0' 16380 1727204142.05287: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204142.05306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204142.05324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204142.05406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204142.05443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204142.05463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204142.05482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204142.05553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204142.07664: stdout chunk (state=3): >>>ansible-tmp-1727204142.0461764-16575-201094406905087=/root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087 <<< 16380 1727204142.07938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204142.07996: stdout chunk (state=3): >>><<< 16380 1727204142.08000: stderr chunk (state=3): >>><<< 16380 1727204142.08003: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204142.0461764-16575-201094406905087=/root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204142.08078: variable 'ansible_module_compression' from source: unknown 16380 1727204142.08166: ANSIBALLZ: Using lock for stat 16380 1727204142.08170: ANSIBALLZ: Acquiring lock 16380 1727204142.08172: ANSIBALLZ: Lock acquired: 140602939598432 16380 1727204142.08203: ANSIBALLZ: Creating module 16380 1727204142.44800: ANSIBALLZ: Writing module into payload 16380 1727204142.44966: ANSIBALLZ: Writing module 16380 1727204142.45021: ANSIBALLZ: Renaming module 16380 1727204142.45208: ANSIBALLZ: Done creating module 16380 1727204142.45295: variable 'ansible_facts' from source: unknown 16380 1727204142.45328: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/AnsiballZ_stat.py 16380 1727204142.45668: Sending initial data 16380 1727204142.45679: Sent initial data (153 bytes) 16380 1727204142.47017: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204142.47034: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204142.47086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204142.47102: stderr chunk (state=3): >>>debug2: match found <<< 16380 1727204142.47121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204142.47215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204142.47243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204142.47318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204142.47335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204142.49104: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204142.49137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204142.49232: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpwj1l2fvg /root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/AnsiballZ_stat.py <<< 16380 1727204142.49235: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/AnsiballZ_stat.py" <<< 16380 1727204142.49238: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpwj1l2fvg" to remote "/root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/AnsiballZ_stat.py" <<< 16380 1727204142.51006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204142.51094: stderr chunk (state=3): >>><<< 16380 1727204142.51104: stdout chunk (state=3): >>><<< 16380 1727204142.51450: done transferring module to remote 16380 1727204142.51458: _low_level_execute_command(): starting 16380 1727204142.51460: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/ /root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/AnsiballZ_stat.py && sleep 0' 16380 1727204142.52542: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204142.52557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204142.52783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204142.52956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204142.52999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204142.55049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204142.55067: stderr chunk (state=3): >>><<< 16380 1727204142.55095: stdout chunk (state=3): >>><<< 16380 1727204142.55161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204142.55202: _low_level_execute_command(): starting 16380 1727204142.55218: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/AnsiballZ_stat.py && sleep 0' 16380 1727204142.56695: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204142.56829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204142.56984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204142.57811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204142.59842: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 16380 1727204142.59882: stdout chunk (state=3): >>>import _imp # builtin <<< 16380 1727204142.59950: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 16380 1727204142.59983: stdout chunk (state=3): >>>import '_io' # <<< 16380 1727204142.60095: stdout chunk (state=3): >>>import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 16380 1727204142.60126: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 16380 1727204142.60198: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204142.60220: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 16380 1727204142.60380: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 16380 1727204142.60385: stdout chunk (state=3): >>>import 
'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a12c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a0fbad0> <<< 16380 1727204142.60388: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 16380 1727204142.60423: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a12ea20> <<< 16380 1727204142.60435: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 16380 1727204142.60554: stdout chunk (state=3): >>>import '_collections_abc' # <<< 16380 1727204142.60557: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 16380 1727204142.60655: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 16380 1727204142.60706: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 16380 1727204142.60743: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f410a0> <<< 16380 1727204142.60796: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 16380 1727204142.60858: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f41fd0> import 'site' # <<< 16380 1727204142.60863: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 16380 1727204142.61143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 16380 1727204142.61162: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 16380 1727204142.61182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204142.61469: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 16380 1727204142.61618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549fb78c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549fb7f50> <<< 16380 1727204142.61698: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f97b60> import '_functools' # <<< 16380 1727204142.61795: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f952b0> <<< 16380 1727204142.61879: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7d070> <<< 16380 1727204142.62263: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 16380 1727204142.62270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa549fdb890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549fda4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f962a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549fd8bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00c800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204142.62513: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a00ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00cb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a00cf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00d610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00d2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00e510> <<< 16380 1727204142.62517: stdout chunk (state=3): >>>import 'importlib.util' # <<< 16380 1727204142.62553: stdout chunk (state=3): >>>import 'runpy' # <<< 16380 1727204142.62560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 16380 1727204142.62634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 16380 1727204142.62650: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a028740> <<< 16380 1727204142.62655: stdout chunk (state=3): >>>import 'errno' # <<< 16380 1727204142.62699: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a029e80> <<< 16380 1727204142.62732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 16380 1727204142.62763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 16380 1727204142.62846: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a02ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204142.62861: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a02b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a02a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 16380 1727204142.62919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 16380 1727204142.63015: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a02be30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a02b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00e570> <<< 16380 1727204142.63025: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 16380 1727204142.63085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 16380 1727204142.63095: stdout chunk (state=3): >>> <<< 16380 1727204142.63131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 16380 1727204142.63202: stdout chunk (state=3): >>> # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204142.63220: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204142.63265: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549debd40><<< 16380 1727204142.63316: stdout chunk (state=3): 
>>> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py<<< 16380 1727204142.63320: stdout chunk (state=3): >>> <<< 16380 1727204142.63322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 16380 1727204142.63348: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 16380 1727204142.63373: stdout chunk (state=3): >>> import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549e14860><<< 16380 1727204142.63482: stdout chunk (state=3): >>> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e145c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549e14890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549e14a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549de9ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 16380 1727204142.63579: stdout chunk (state=3): >>> <<< 16380 1727204142.63911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e16180> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e14e00><<< 16380 1727204142.63915: stdout chunk (state=3): >>> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00ec60> <<< 16380 1727204142.63971: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 16380 1727204142.64094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 16380 1727204142.64172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 16380 1727204142.64244: stdout chunk (state=3): >>> import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e3e510> <<< 16380 1727204142.64359: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc'<<< 16380 1727204142.64394: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py<<< 16380 1727204142.64408: stdout chunk (state=3): >>> <<< 16380 1727204142.64444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 16380 1727204142.64531: stdout chunk (state=3): >>> import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e5a690> <<< 16380 1727204142.64659: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 16380 1727204142.64697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 16380 1727204142.64801: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e8f410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 16380 1727204142.64839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 16380 1727204142.65033: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549eb9bb0> <<< 16380 1727204142.65154: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e8f530> <<< 16380 1727204142.65160: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e5b320> <<< 16380 1727204142.65167: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549c944a0> <<< 16380 1727204142.65251: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e596d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e170b0> <<< 16380 1727204142.65287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 16380 1727204142.65595: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa549e597f0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_oaunjx1b/ansible_stat_payload.zip' # zipimport: zlib available <<< 16380 1727204142.65618: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 16380 1727204142.65898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549cee0c0> import '_typing' # <<< 16380 1727204142.65951: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549cc5040> <<< 16380 1727204142.65957: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549cc41a0> # zipimport: zlib available <<< 16380 1727204142.66005: stdout chunk (state=3): >>>import 'ansible' # <<< 16380 1727204142.66013: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.66036: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16380 1727204142.66046: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 16380 1727204142.66064: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.68132: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.70271: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 16380 1727204142.70284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549cc7fe0> <<< 16380 1727204142.70308: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 16380 1727204142.70339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 16380 1727204142.70352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 16380 1727204142.70379: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 16380 1727204142.70389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 16380 1727204142.70579: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549d19b80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549d19910> <<< 16380 1727204142.70582: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549d19220> <<< 16380 1727204142.70646: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549d19c70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549ceeb70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549d1a930> <<< 16380 1727204142.70703: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549d1ab70> <<< 16380 1727204142.70906: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549d1b0b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 16380 1727204142.70953: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b7ce90> <<< 16380 1727204142.71125: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549b7eab0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 16380 1727204142.71132: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b7f3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 16380 1727204142.71155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 16380 1727204142.71233: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b80590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 16380 1727204142.71262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 16380 1727204142.71404: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b83080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549b831d0> <<< 16380 1727204142.71414: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b81340> <<< 16380 1727204142.71441: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 
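An aside on the wall of "import ..." / "# code object from ..." lines filling these stdout chunks (and continuing below): the controller launched the module with a PYTHONVERBOSE=1 prefix (visible in the _low_level_execute_command record above), so the remote interpreter prints one line for every module it resolves while unpacking and running AnsiballZ_stat.py. The same trace can be reproduced for any script; a minimal sketch, assuming only a python3 on PATH:

    import subprocess

    # "python3 -v" is equivalent to setting PYTHONVERBOSE=1, which is how the
    # AnsiballZ_stat.py invocation above was started; the trace goes to stderr.
    result = subprocess.run(
        ["python3", "-v", "-c", "import json"],
        capture_output=True,
        text=True,
    )
    # One "import ..." or "# code object from ..." line per resolved module,
    # matching the shape of the stdout chunks in this log.
    for line in result.stderr.splitlines()[:10]:
        print(line)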
16380 1727204142.71518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 16380 1727204142.71550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 16380 1727204142.71594: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 16380 1727204142.71796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 16380 1727204142.71800: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b86ff0> import '_tokenize' # <<< 16380 1727204142.71825: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b85ac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b85820> <<< 16380 1727204142.71990: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b87f80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b81850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bcf110> <<< 16380 1727204142.72121: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bcf290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bd0e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bd0c20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 16380 1727204142.72294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 16380 1727204142.72377: stdout chunk (state=3): >>># extension module '_uuid' loaded 
from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bd33b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bd1550> <<< 16380 1727204142.72451: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204142.72498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 16380 1727204142.72514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 16380 1727204142.72587: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bdaba0> <<< 16380 1727204142.72841: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bd3530> <<< 16380 1727204142.72981: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdbe60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204142.73040: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdba10> <<< 16380 1727204142.73061: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdbf20> <<< 16380 1727204142.73139: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bcf590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 16380 1727204142.73205: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204142.73249: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdf680> <<< 16380 1727204142.73558: stdout 
chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204142.73595: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549be0650> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bdddf0> <<< 16380 1727204142.73698: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdf170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bdd9d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 16380 1727204142.73848: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.74032: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 16380 1727204142.74068: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 16380 1727204142.74171: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.74311: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.74544: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.75908: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.76900: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 16380 1727204142.77039: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204142.77225: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549c68830> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 16380 1727204142.77260: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549c69580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b85a30> <<< 16380 1727204142.77332: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 16380 1727204142.77364: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.77469: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 16380 1727204142.77483: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 16380 1727204142.77713: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.78024: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 16380 1727204142.78046: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549c69340> # zipimport: zlib available <<< 16380 1727204142.79033: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.79976: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.80098: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.80233: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 16380 1727204142.80263: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.80312: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.80416: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 16380 1727204142.80466: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.80499: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.80685: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 16380 1727204142.80723: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 16380 1727204142.80751: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.80920: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 16380 1727204142.81343: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.81923: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 16380 1727204142.81938: stdout chunk (state=3): >>>import '_ast' # <<< 16380 1727204142.82074: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549c6bf50> <<< 16380 1727204142.82087: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.82206: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.82338: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 16380 1727204142.82394: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 16380 1727204142.82417: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 16380 1727204142.82505: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 16380 1727204142.82754: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549a75f10> # extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549a76840> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549c6b290> <<< 16380 1727204142.82778: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.82835: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.82891: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 16380 1727204142.82936: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.83008: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.83121: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 16380 1727204142.83244: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 16380 1727204142.83354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 16380 1727204142.83481: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549a756d0> <<< 16380 1727204142.83501: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549a76a80> <<< 16380 1727204142.83538: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 16380 1727204142.83650: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.83937: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 16380 1727204142.83941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 16380 1727204142.84036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 16380 1727204142.84148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b06d50> <<< 16380 1727204142.84223: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549a80b30> <<< 16380 1727204142.84353: stdout chunk (state=3): >>>import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa549a7eb70> <<< 16380 1727204142.84371: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549a7e9c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 16380 1727204142.84465: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.84495: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 16380 1727204142.84528: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 16380 1727204142.84594: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 16380 1727204142.84612: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.84837: stdout chunk (state=3): >>># zipimport: zlib available <<< 16380 1727204142.85358: stdout chunk (state=3): >>># zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 16380 1727204142.85855: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 16380 1727204142.85905: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins <<< 16380 1727204142.86044: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools <<< 16380 1727204142.86066: stdout chunk (state=3): >>># cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing 
re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 16380 1727204142.86340: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 16380 1727204142.86492: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # 
destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array <<< 16380 1727204142.86591: stdout chunk (state=3): >>># destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 16380 1727204142.86689: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 16380 1727204142.86698: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 16380 1727204142.86938: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 16380 1727204142.86942: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 16380 1727204142.86978: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 16380 1727204142.86982: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 16380 1727204142.87005: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 16380 1727204142.87047: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # 
destroy contextlib # destroy _typing <<< 16380 1727204142.87075: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 16380 1727204142.87167: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 16380 1727204142.87313: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 16380 1727204142.87337: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 16380 1727204142.88041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 16380 1727204142.88045: stdout chunk (state=3): >>><<< 16380 1727204142.88048: stderr chunk (state=3): >>><<< 16380 1727204142.88415: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a12c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a0fbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a12ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f410a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f41fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549fb78c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549fb7f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f97b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f952b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7d070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549fdb890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549fda4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f962a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549fd8bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00c800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a00ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00cb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a00cf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549f7ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00d610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00d2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00e510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a028740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a029e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a02ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a02b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a02a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa54a02be30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a02b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00e570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549debd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549e14860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e145c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549e14890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549e14a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549de9ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e16180> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e14e00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa54a00ec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e3e510> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e5a690> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e8f410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549eb9bb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e8f530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e5b320> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa549c944a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e596d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549e170b0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa549e597f0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_oaunjx1b/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549cee0c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549cc5040> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549cc41a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549cc7fe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549d19b80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549d19910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549d19220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549d19c70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549ceeb70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' 
executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549d1a930> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549d1ab70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549d1b0b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b7ce90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549b7eab0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b7f3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b80590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b83080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549b831d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b81340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b86ff0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b85ac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b85820> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b87f80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b81850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bcf110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bcf290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bd0e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bd0c20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bd33b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bd1550> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bdaba0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bd3530> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdbe60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdba10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdbf20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bcf590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdf680> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549be0650> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bdddf0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549bdf170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549bdd9d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module 
'_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549c68830> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549c69580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b85a30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549c69340> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549c6bf50> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549a75f10> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549a76840> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549c6b290> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa549a756d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549a76a80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549b06d50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549a80b30> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549a7eb70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa549a7e9c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing 
encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # 
cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # 
destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # 
cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
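The stderr above shows OpenSSH reusing a multiplexed master connection ("auto-mux: Trying existing master", mux session id 2), which is how Ansible avoids a fresh SSH handshake for every task. As a rough illustration only: multiplexing of this kind is normally driven by ControlMaster/ControlPersist options passed via ansible_ssh_common_args. The host/address pairing below is taken from this log; the option values are illustrative assumptions, not settings read from this run.

    # Hypothetical inventory host vars; the option values are assumed, not taken from this run.
    managed-node2:
      ansible_host: 10.31.9.159
      ansible_ssh_common_args: >-
        -o ControlMaster=auto
        -o ControlPersist=60s
        -o ControlPath=~/.ansible/cp/%h-%p-%r

With these options in place, the first connection becomes the master and later tasks attach to it as mux clients, which matches the "mux_client_request_session" lines in the trace.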
[WARNING]: Module invocation had junk after the JSON data: [trimmed: verbatim repeat of the interpreter cleanup trace above, "# destroy __main__" through "# clear sys.audit hooks"] 16380 1727204142.89678: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204142.89682: _low_level_execute_command(): starting 16380 1727204142.89685:
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204142.0461764-16575-201094406905087/ > /dev/null 2>&1 && sleep 0' 16380 1727204142.89894: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204142.89915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204142.89953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204142.89966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204142.90086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204142.90260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204142.93000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204142.93171: stderr chunk (state=3): >>><<< 16380 1727204142.93174: stdout chunk (state=3): >>><<< 16380 1727204142.93177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204142.93179: handler run complete 16380 1727204142.93181: attempt loop complete, returning result 16380 1727204142.93184: _execute() done 16380 1727204142.93186: dumping result to json 16380 1727204142.93188: done dumping result, returning 16380 1727204142.93193: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [12b410aa-8751-749c-b6eb-000000000091] 16380 1727204142.93196: sending task result for task 12b410aa-8751-749c-b6eb-000000000091 ok: [managed-node2] => { "changed": false, "stat": { "exists": 
false } } 16380 1727204142.93495: no more pending results, returning what we have 16380 1727204142.93499: results queue empty 16380 1727204142.93500: checking for any_errors_fatal 16380 1727204142.93507: done checking for any_errors_fatal 16380 1727204142.93794: checking for max_fail_percentage 16380 1727204142.93797: done checking for max_fail_percentage 16380 1727204142.93798: checking to see if all hosts have failed and the running result is not ok 16380 1727204142.93799: done checking to see if all hosts have failed 16380 1727204142.93800: getting the remaining hosts for this loop 16380 1727204142.93802: done getting the remaining hosts for this loop 16380 1727204142.93807: getting the next task for host managed-node2 16380 1727204142.93817: done getting next task for host managed-node2 16380 1727204142.93821: ^ task is: TASK: Set flag to indicate system is ostree 16380 1727204142.93824: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204142.93828: getting variables 16380 1727204142.93830: in VariableManager get_vars() 16380 1727204142.93862: Calling all_inventory to load vars for managed-node2 16380 1727204142.93865: Calling groups_inventory to load vars for managed-node2 16380 1727204142.93869: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204142.93999: Calling all_plugins_play to load vars for managed-node2 16380 1727204142.94003: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204142.94012: done sending task result for task 12b410aa-8751-749c-b6eb-000000000091 16380 1727204142.94015: WORKER PROCESS EXITING 16380 1727204142.94021: Calling groups_plugins_play to load vars for managed-node2 16380 1727204142.94637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204142.95255: done with get_vars() 16380 1727204142.95269: done getting variables 16380 1727204142.95504: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.981) 0:00:04.061 ***** 16380 1727204142.95540: entering _queue_task() for managed-node2/set_fact 16380 1727204142.95543: Creating lock for set_fact 16380 1727204142.96203: worker is 1 (out of 1 available) 16380 1727204142.96219: exiting _queue_task() for managed-node2/set_fact 16380 1727204142.96233: done queuing things up, now waiting for results queue to drain 16380 1727204142.96236: waiting for pending results... 
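The result above ("exists": false) comes from the stat invocation whose module_args were printed earlier (path=/run/ostree-booted; the get_checksum/get_mime/get_attributes/checksum_algorithm values are the module defaults). A minimal sketch of the task as it plausibly appears in el_repo_setup.yml: the task name, the probed path, and the __ostree_booted_stat variable are confirmed by the log, while the rest is reconstruction.

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted   # present only on OSTree-based systems
      register: __ostree_booted_stat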
16380 1727204142.96650: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 16380 1727204142.96968: in run() - task 12b410aa-8751-749c-b6eb-000000000092 16380 1727204142.96992: variable 'ansible_search_path' from source: unknown 16380 1727204142.97001: variable 'ansible_search_path' from source: unknown 16380 1727204142.97102: calling self._execute() 16380 1727204142.97370: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204142.97374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204142.97377: variable 'omit' from source: magic vars 16380 1727204142.98396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204142.99181: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204142.99312: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204142.99414: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204142.99516: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204142.99798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204142.99841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204142.99881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204143.00198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204143.00287: Evaluated conditional (not __network_is_ostree is defined): True 16380 1727204143.00321: variable 'omit' from source: magic vars 16380 1727204143.00469: variable 'omit' from source: magic vars 16380 1727204143.00716: variable '__ostree_booted_stat' from source: set_fact 16380 1727204143.00917: variable 'omit' from source: magic vars 16380 1727204143.00999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204143.01042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204143.01260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204143.01294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204143.01318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204143.01359: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204143.01367: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.01382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.01633: Set connection var ansible_module_compression to ZIP_DEFLATED 
16380 1727204143.01713: Set connection var ansible_shell_executable to /bin/sh 16380 1727204143.01731: Set connection var ansible_connection to ssh 16380 1727204143.01743: Set connection var ansible_shell_type to sh 16380 1727204143.01756: Set connection var ansible_pipelining to False 16380 1727204143.01772: Set connection var ansible_timeout to 10 16380 1727204143.01847: variable 'ansible_shell_executable' from source: unknown 16380 1727204143.02036: variable 'ansible_connection' from source: unknown 16380 1727204143.02039: variable 'ansible_module_compression' from source: unknown 16380 1727204143.02041: variable 'ansible_shell_type' from source: unknown 16380 1727204143.02044: variable 'ansible_shell_executable' from source: unknown 16380 1727204143.02046: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.02049: variable 'ansible_pipelining' from source: unknown 16380 1727204143.02051: variable 'ansible_timeout' from source: unknown 16380 1727204143.02052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.02212: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204143.02268: variable 'omit' from source: magic vars 16380 1727204143.02497: starting attempt loop 16380 1727204143.02501: running the handler 16380 1727204143.02504: handler run complete 16380 1727204143.02506: attempt loop complete, returning result 16380 1727204143.02511: _execute() done 16380 1727204143.02514: dumping result to json 16380 1727204143.02516: done dumping result, returning 16380 1727204143.02518: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [12b410aa-8751-749c-b6eb-000000000092] 16380 1727204143.02520: sending task result for task 12b410aa-8751-749c-b6eb-000000000092 16380 1727204143.02593: done sending task result for task 12b410aa-8751-749c-b6eb-000000000092 16380 1727204143.02597: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 16380 1727204143.02663: no more pending results, returning what we have 16380 1727204143.02667: results queue empty 16380 1727204143.02668: checking for any_errors_fatal 16380 1727204143.02676: done checking for any_errors_fatal 16380 1727204143.02677: checking for max_fail_percentage 16380 1727204143.02678: done checking for max_fail_percentage 16380 1727204143.02679: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.02680: done checking to see if all hosts have failed 16380 1727204143.02681: getting the remaining hosts for this loop 16380 1727204143.02683: done getting the remaining hosts for this loop 16380 1727204143.02688: getting the next task for host managed-node2 16380 1727204143.02701: done getting next task for host managed-node2 16380 1727204143.02705: ^ task is: TASK: Fix CentOS6 Base repo 16380 1727204143.02708: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204143.02715: getting variables 16380 1727204143.02717: in VariableManager get_vars() 16380 1727204143.02753: Calling all_inventory to load vars for managed-node2 16380 1727204143.02757: Calling groups_inventory to load vars for managed-node2 16380 1727204143.02761: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.02775: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.02779: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.02933: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.03572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.04155: done with get_vars() 16380 1727204143.04169: done getting variables 16380 1727204143.04545: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.090) 0:00:04.152 ***** 16380 1727204143.04581: entering _queue_task() for managed-node2/copy 16380 1727204143.05235: worker is 1 (out of 1 available) 16380 1727204143.05248: exiting _queue_task() for managed-node2/copy 16380 1727204143.05261: done queuing things up, now waiting for results queue to drain 16380 1727204143.05264: waiting for pending results... 
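The set_fact task above produced __network_is_ostree: false, guarded by the conditional the log printed verbatim (not __network_is_ostree is defined) and fed by the registered __ostree_booted_stat variable. A sketch under those observations; the exact Jinja2 expression on the right-hand side is an assumption.

    - name: Set flag to indicate system is ostree
      set_fact:
        # assumed derivation: the registered stat result's exists flag (false on this host)
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined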
16380 1727204143.05483: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 16380 1727204143.05638: in run() - task 12b410aa-8751-749c-b6eb-000000000094 16380 1727204143.05643: variable 'ansible_search_path' from source: unknown 16380 1727204143.05647: variable 'ansible_search_path' from source: unknown 16380 1727204143.05650: calling self._execute() 16380 1727204143.05746: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.05751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.05762: variable 'omit' from source: magic vars 16380 1727204143.06380: variable 'ansible_distribution' from source: facts 16380 1727204143.06434: Evaluated conditional (ansible_distribution == 'CentOS'): False 16380 1727204143.06440: when evaluation is False, skipping this task 16380 1727204143.06446: _execute() done 16380 1727204143.06448: dumping result to json 16380 1727204143.06450: done dumping result, returning 16380 1727204143.06496: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [12b410aa-8751-749c-b6eb-000000000094] 16380 1727204143.06499: sending task result for task 12b410aa-8751-749c-b6eb-000000000094 16380 1727204143.06606: done sending task result for task 12b410aa-8751-749c-b6eb-000000000094 16380 1727204143.06611: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 16380 1727204143.06893: no more pending results, returning what we have 16380 1727204143.06898: results queue empty 16380 1727204143.06899: checking for any_errors_fatal 16380 1727204143.06905: done checking for any_errors_fatal 16380 1727204143.06906: checking for max_fail_percentage 16380 1727204143.06908: done checking for max_fail_percentage 16380 1727204143.06909: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.06910: done checking to see if all hosts have failed 16380 1727204143.06911: getting the remaining hosts for this loop 16380 1727204143.06912: done getting the remaining hosts for this loop 16380 1727204143.06916: getting the next task for host managed-node2 16380 1727204143.06923: done getting next task for host managed-node2 16380 1727204143.06926: ^ task is: TASK: Include the task 'enable_epel.yml' 16380 1727204143.06929: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204143.06933: getting variables 16380 1727204143.06935: in VariableManager get_vars() 16380 1727204143.06963: Calling all_inventory to load vars for managed-node2 16380 1727204143.06966: Calling groups_inventory to load vars for managed-node2 16380 1727204143.06970: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.06981: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.06985: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.06994: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.07577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.07890: done with get_vars() 16380 1727204143.07904: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.034) 0:00:04.186 ***** 16380 1727204143.08017: entering _queue_task() for managed-node2/include_tasks 16380 1727204143.08265: worker is 1 (out of 1 available) 16380 1727204143.08278: exiting _queue_task() for managed-node2/include_tasks 16380 1727204143.08293: done queuing things up, now waiting for results queue to drain 16380 1727204143.08295: waiting for pending results... 16380 1727204143.08911: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 16380 1727204143.08916: in run() - task 12b410aa-8751-749c-b6eb-000000000095 16380 1727204143.08920: variable 'ansible_search_path' from source: unknown 16380 1727204143.08926: variable 'ansible_search_path' from source: unknown 16380 1727204143.08930: calling self._execute() 16380 1727204143.09010: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.09020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.09032: variable 'omit' from source: magic vars 16380 1727204143.09614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204143.11666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204143.11670: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204143.11715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204143.11757: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204143.11786: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204143.11881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204143.11937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204143.11977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 16380 1727204143.12067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204143.12134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204143.12619: variable '__network_is_ostree' from source: set_fact 16380 1727204143.12622: Evaluated conditional (not __network_is_ostree | d(false)): True 16380 1727204143.12625: _execute() done 16380 1727204143.12627: dumping result to json 16380 1727204143.12629: done dumping result, returning 16380 1727204143.12632: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-749c-b6eb-000000000095] 16380 1727204143.12634: sending task result for task 12b410aa-8751-749c-b6eb-000000000095 16380 1727204143.12828: no more pending results, returning what we have 16380 1727204143.12834: in VariableManager get_vars() 16380 1727204143.12867: Calling all_inventory to load vars for managed-node2 16380 1727204143.12872: Calling groups_inventory to load vars for managed-node2 16380 1727204143.12876: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.12888: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.12893: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.12897: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.13253: done sending task result for task 12b410aa-8751-749c-b6eb-000000000095 16380 1727204143.13257: WORKER PROCESS EXITING 16380 1727204143.13269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.13424: done with get_vars() 16380 1727204143.13431: variable 'ansible_search_path' from source: unknown 16380 1727204143.13432: variable 'ansible_search_path' from source: unknown 16380 1727204143.13462: we have included files to process 16380 1727204143.13463: generating all_blocks data 16380 1727204143.13464: done generating all_blocks data 16380 1727204143.13468: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 16380 1727204143.13469: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 16380 1727204143.13471: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 16380 1727204143.14039: done processing included file 16380 1727204143.14040: iterating over new_blocks loaded from include file 16380 1727204143.14041: in VariableManager get_vars() 16380 1727204143.14051: done with get_vars() 16380 1727204143.14052: filtering new block on tags 16380 1727204143.14070: done filtering new block on tags 16380 1727204143.14072: in VariableManager get_vars() 16380 1727204143.14098: done with get_vars() 16380 1727204143.14100: filtering new block on tags 16380 1727204143.14111: done filtering new block on tags 16380 1727204143.14113: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 16380 1727204143.14117: extending task lists for all hosts with included blocks 
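The include that was just processed is fully visible in the trace: an `include_tasks` of enable_epel.yml at el_repo_setup.yml:51, gated on the `__network_is_ostree` fact set earlier in the run. Reconstructed directly from the logged conditional:

```yaml
# Both the include target and the when: expression appear verbatim in the
# trace ("Evaluated conditional (not __network_is_ostree | d(false)): True").
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```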
16380 1727204143.14198: done extending task lists 16380 1727204143.14199: done processing included files 16380 1727204143.14199: results queue empty 16380 1727204143.14200: checking for any_errors_fatal 16380 1727204143.14202: done checking for any_errors_fatal 16380 1727204143.14203: checking for max_fail_percentage 16380 1727204143.14203: done checking for max_fail_percentage 16380 1727204143.14204: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.14205: done checking to see if all hosts have failed 16380 1727204143.14205: getting the remaining hosts for this loop 16380 1727204143.14206: done getting the remaining hosts for this loop 16380 1727204143.14208: getting the next task for host managed-node2 16380 1727204143.14212: done getting next task for host managed-node2 16380 1727204143.14213: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 16380 1727204143.14215: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204143.14217: getting variables 16380 1727204143.14217: in VariableManager get_vars() 16380 1727204143.14224: Calling all_inventory to load vars for managed-node2 16380 1727204143.14225: Calling groups_inventory to load vars for managed-node2 16380 1727204143.14227: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.14231: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.14236: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.14239: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.14347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.14496: done with get_vars() 16380 1727204143.14502: done getting variables 16380 1727204143.14557: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 16380 1727204143.14821: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.068) 0:00:04.255 ***** 16380 1727204143.14875: entering _queue_task() for managed-node2/command 16380 1727204143.14877: Creating lock for command 16380 1727204143.15120: worker is 1 (out of 1 available) 16380 1727204143.15133: exiting _queue_task() for managed-node2/command 16380 1727204143.15149: done queuing things up, now waiting for results queue to drain 16380 1727204143.15151: waiting for pending results... 
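Note how the task header above renders as "Create EPEL 39": the name at enable_epel.yml:8 is templated with `ansible_distribution_major_version`, which the facts report as 39 on this (non-RHEL/CentOS) node. A sketch of the shape of that task, where the command body is an assumption and only the templated name, the `command` module, and the guard come from the trace:

```yaml
# The epel-release URL below is illustrative; the trace shows only that this
# is a command task whose name interpolates the distribution major version.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    dnf install -y
    https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when: ansible_distribution in ['RedHat', 'CentOS']
```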
16380 1727204143.15427: running TaskExecutor() for managed-node2/TASK: Create EPEL 39 16380 1727204143.15522: in run() - task 12b410aa-8751-749c-b6eb-0000000000af 16380 1727204143.15526: variable 'ansible_search_path' from source: unknown 16380 1727204143.15529: variable 'ansible_search_path' from source: unknown 16380 1727204143.15597: calling self._execute() 16380 1727204143.15648: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.15655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.15698: variable 'omit' from source: magic vars 16380 1727204143.16135: variable 'ansible_distribution' from source: facts 16380 1727204143.16155: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 16380 1727204143.16166: when evaluation is False, skipping this task 16380 1727204143.16395: _execute() done 16380 1727204143.16398: dumping result to json 16380 1727204143.16401: done dumping result, returning 16380 1727204143.16404: done running TaskExecutor() for managed-node2/TASK: Create EPEL 39 [12b410aa-8751-749c-b6eb-0000000000af] 16380 1727204143.16406: sending task result for task 12b410aa-8751-749c-b6eb-0000000000af 16380 1727204143.16479: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000af 16380 1727204143.16483: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 16380 1727204143.16531: no more pending results, returning what we have 16380 1727204143.16534: results queue empty 16380 1727204143.16535: checking for any_errors_fatal 16380 1727204143.16536: done checking for any_errors_fatal 16380 1727204143.16537: checking for max_fail_percentage 16380 1727204143.16539: done checking for max_fail_percentage 16380 1727204143.16540: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.16541: done checking to see if all hosts have failed 16380 1727204143.16542: getting the remaining hosts for this loop 16380 1727204143.16543: done getting the remaining hosts for this loop 16380 1727204143.16547: getting the next task for host managed-node2 16380 1727204143.16553: done getting next task for host managed-node2 16380 1727204143.16555: ^ task is: TASK: Install yum-utils package 16380 1727204143.16559: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204143.16562: getting variables 16380 1727204143.16564: in VariableManager get_vars() 16380 1727204143.16591: Calling all_inventory to load vars for managed-node2 16380 1727204143.16595: Calling groups_inventory to load vars for managed-node2 16380 1727204143.16599: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.16609: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.16612: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.16617: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.16888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.17088: done with get_vars() 16380 1727204143.17100: done getting variables 16380 1727204143.17204: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.023) 0:00:04.278 ***** 16380 1727204143.17238: entering _queue_task() for managed-node2/package 16380 1727204143.17239: Creating lock for package 16380 1727204143.17460: worker is 1 (out of 1 available) 16380 1727204143.17473: exiting _queue_task() for managed-node2/package 16380 1727204143.17485: done queuing things up, now waiting for results queue to drain 16380 1727204143.17487: waiting for pending results... 
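The yum-utils task queued above is a `package` action (enable_epel.yml:26) behind the same distribution guard, so it too is skipped just below. A minimal sketch, assuming only the `state:` value:

```yaml
# Module, task name, and guard come from the trace; state is assumed.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']
```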
16380 1727204143.17739: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 16380 1727204143.17875: in run() - task 12b410aa-8751-749c-b6eb-0000000000b0 16380 1727204143.17900: variable 'ansible_search_path' from source: unknown 16380 1727204143.17911: variable 'ansible_search_path' from source: unknown 16380 1727204143.17957: calling self._execute() 16380 1727204143.18094: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.18098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.18101: variable 'omit' from source: magic vars 16380 1727204143.18508: variable 'ansible_distribution' from source: facts 16380 1727204143.18528: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 16380 1727204143.18538: when evaluation is False, skipping this task 16380 1727204143.18550: _execute() done 16380 1727204143.18562: dumping result to json 16380 1727204143.18584: done dumping result, returning 16380 1727204143.18620: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [12b410aa-8751-749c-b6eb-0000000000b0] 16380 1727204143.18623: sending task result for task 12b410aa-8751-749c-b6eb-0000000000b0 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 16380 1727204143.18757: no more pending results, returning what we have 16380 1727204143.18760: results queue empty 16380 1727204143.18761: checking for any_errors_fatal 16380 1727204143.18767: done checking for any_errors_fatal 16380 1727204143.18768: checking for max_fail_percentage 16380 1727204143.18770: done checking for max_fail_percentage 16380 1727204143.18770: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.18771: done checking to see if all hosts have failed 16380 1727204143.18772: getting the remaining hosts for this loop 16380 1727204143.18774: done getting the remaining hosts for this loop 16380 1727204143.18781: getting the next task for host managed-node2 16380 1727204143.18787: done getting next task for host managed-node2 16380 1727204143.18791: ^ task is: TASK: Enable EPEL 7 16380 1727204143.18795: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204143.18798: getting variables 16380 1727204143.18800: in VariableManager get_vars() 16380 1727204143.18826: Calling all_inventory to load vars for managed-node2 16380 1727204143.18830: Calling groups_inventory to load vars for managed-node2 16380 1727204143.18833: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.18841: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.18843: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.18847: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.18976: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000b0 16380 1727204143.18980: WORKER PROCESS EXITING 16380 1727204143.18995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.19171: done with get_vars() 16380 1727204143.19178: done getting variables 16380 1727204143.19226: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.020) 0:00:04.299 ***** 16380 1727204143.19248: entering _queue_task() for managed-node2/command 16380 1727204143.19479: worker is 1 (out of 1 available) 16380 1727204143.19494: exiting _queue_task() for managed-node2/command 16380 1727204143.19507: done queuing things up, now waiting for results queue to drain 16380 1727204143.19511: waiting for pending results... 
16380 1727204143.19838: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 16380 1727204143.19863: in run() - task 12b410aa-8751-749c-b6eb-0000000000b1 16380 1727204143.19878: variable 'ansible_search_path' from source: unknown 16380 1727204143.19882: variable 'ansible_search_path' from source: unknown 16380 1727204143.19923: calling self._execute() 16380 1727204143.20007: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.20022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.20035: variable 'omit' from source: magic vars 16380 1727204143.20456: variable 'ansible_distribution' from source: facts 16380 1727204143.20474: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 16380 1727204143.20482: when evaluation is False, skipping this task 16380 1727204143.20490: _execute() done 16380 1727204143.20498: dumping result to json 16380 1727204143.20549: done dumping result, returning 16380 1727204143.20552: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [12b410aa-8751-749c-b6eb-0000000000b1] 16380 1727204143.20554: sending task result for task 12b410aa-8751-749c-b6eb-0000000000b1 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 16380 1727204143.20703: no more pending results, returning what we have 16380 1727204143.20707: results queue empty 16380 1727204143.20708: checking for any_errors_fatal 16380 1727204143.20718: done checking for any_errors_fatal 16380 1727204143.20719: checking for max_fail_percentage 16380 1727204143.20720: done checking for max_fail_percentage 16380 1727204143.20721: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.20722: done checking to see if all hosts have failed 16380 1727204143.20723: getting the remaining hosts for this loop 16380 1727204143.20725: done getting the remaining hosts for this loop 16380 1727204143.20729: getting the next task for host managed-node2 16380 1727204143.20737: done getting next task for host managed-node2 16380 1727204143.20740: ^ task is: TASK: Enable EPEL 8 16380 1727204143.20745: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204143.20748: getting variables 16380 1727204143.20750: in VariableManager get_vars() 16380 1727204143.20779: Calling all_inventory to load vars for managed-node2 16380 1727204143.20783: Calling groups_inventory to load vars for managed-node2 16380 1727204143.20787: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.20802: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.20805: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.20811: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.21354: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000b1 16380 1727204143.21359: WORKER PROCESS EXITING 16380 1727204143.21393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.21666: done with get_vars() 16380 1727204143.21679: done getting variables 16380 1727204143.21749: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.025) 0:00:04.324 ***** 16380 1727204143.21784: entering _queue_task() for managed-node2/command 16380 1727204143.22058: worker is 1 (out of 1 available) 16380 1727204143.22071: exiting _queue_task() for managed-node2/command 16380 1727204143.22085: done queuing things up, now waiting for results queue to drain 16380 1727204143.22088: waiting for pending results... 
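A hypothetical sketch of the version-specific pattern this stretch of the trace is iterating over (Enable EPEL 7 at enable_epel.yml:32, EPEL 8 at :37, EPEL 6 at :42). Only the task names, the module types, and the `ansible_distribution in ['RedHat', 'CentOS']` guard are confirmed; the command and the major-version test are assumptions:

```yaml
# Assumed payload: the trace never reaches the command because the
# distribution guard fails first on this node.
- name: Enable EPEL 8
  command: dnf config-manager --set-enabled epel      # assumed command
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '8'       # assumed narrowing
```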
16380 1727204143.22403: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 16380 1727204143.22493: in run() - task 12b410aa-8751-749c-b6eb-0000000000b2 16380 1727204143.22516: variable 'ansible_search_path' from source: unknown 16380 1727204143.22520: variable 'ansible_search_path' from source: unknown 16380 1727204143.22554: calling self._execute() 16380 1727204143.22626: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.22631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.22641: variable 'omit' from source: magic vars 16380 1727204143.22966: variable 'ansible_distribution' from source: facts 16380 1727204143.22977: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 16380 1727204143.22982: when evaluation is False, skipping this task 16380 1727204143.22985: _execute() done 16380 1727204143.22988: dumping result to json 16380 1727204143.22998: done dumping result, returning 16380 1727204143.23001: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [12b410aa-8751-749c-b6eb-0000000000b2] 16380 1727204143.23013: sending task result for task 12b410aa-8751-749c-b6eb-0000000000b2 16380 1727204143.23105: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000b2 16380 1727204143.23112: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 16380 1727204143.23185: no more pending results, returning what we have 16380 1727204143.23188: results queue empty 16380 1727204143.23191: checking for any_errors_fatal 16380 1727204143.23195: done checking for any_errors_fatal 16380 1727204143.23196: checking for max_fail_percentage 16380 1727204143.23198: done checking for max_fail_percentage 16380 1727204143.23199: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.23199: done checking to see if all hosts have failed 16380 1727204143.23200: getting the remaining hosts for this loop 16380 1727204143.23202: done getting the remaining hosts for this loop 16380 1727204143.23206: getting the next task for host managed-node2 16380 1727204143.23218: done getting next task for host managed-node2 16380 1727204143.23228: ^ task is: TASK: Enable EPEL 6 16380 1727204143.23232: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204143.23236: getting variables 16380 1727204143.23238: in VariableManager get_vars() 16380 1727204143.23262: Calling all_inventory to load vars for managed-node2 16380 1727204143.23264: Calling groups_inventory to load vars for managed-node2 16380 1727204143.23267: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.23274: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.23276: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.23279: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.23451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.23611: done with get_vars() 16380 1727204143.23619: done getting variables 16380 1727204143.23668: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.019) 0:00:04.343 ***** 16380 1727204143.23694: entering _queue_task() for managed-node2/copy 16380 1727204143.23905: worker is 1 (out of 1 available) 16380 1727204143.23921: exiting _queue_task() for managed-node2/copy 16380 1727204143.23933: done queuing things up, now waiting for results queue to drain 16380 1727204143.23935: waiting for pending results... 16380 1727204143.24092: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 16380 1727204143.24172: in run() - task 12b410aa-8751-749c-b6eb-0000000000b4 16380 1727204143.24184: variable 'ansible_search_path' from source: unknown 16380 1727204143.24190: variable 'ansible_search_path' from source: unknown 16380 1727204143.24222: calling self._execute() 16380 1727204143.24286: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.24290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.24303: variable 'omit' from source: magic vars 16380 1727204143.24629: variable 'ansible_distribution' from source: facts 16380 1727204143.24641: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 16380 1727204143.24644: when evaluation is False, skipping this task 16380 1727204143.24647: _execute() done 16380 1727204143.24650: dumping result to json 16380 1727204143.24655: done dumping result, returning 16380 1727204143.24663: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [12b410aa-8751-749c-b6eb-0000000000b4] 16380 1727204143.24668: sending task result for task 12b410aa-8751-749c-b6eb-0000000000b4 16380 1727204143.24771: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000b4 16380 1727204143.24774: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 16380 1727204143.24829: no more pending results, returning what we have 16380 1727204143.24832: results queue empty 16380 1727204143.24833: checking for any_errors_fatal 16380 1727204143.24838: done checking for any_errors_fatal 16380 1727204143.24839: checking for 
max_fail_percentage 16380 1727204143.24841: done checking for max_fail_percentage 16380 1727204143.24841: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.24842: done checking to see if all hosts have failed 16380 1727204143.24843: getting the remaining hosts for this loop 16380 1727204143.24845: done getting the remaining hosts for this loop 16380 1727204143.24848: getting the next task for host managed-node2 16380 1727204143.24857: done getting next task for host managed-node2 16380 1727204143.24860: ^ task is: TASK: Set network provider to 'nm' 16380 1727204143.24862: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204143.24866: getting variables 16380 1727204143.24867: in VariableManager get_vars() 16380 1727204143.24902: Calling all_inventory to load vars for managed-node2 16380 1727204143.24905: Calling groups_inventory to load vars for managed-node2 16380 1727204143.24911: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.24919: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.24921: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.24924: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.25062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.25219: done with get_vars() 16380 1727204143.25229: done getting variables 16380 1727204143.25275: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.016) 0:00:04.359 ***** 16380 1727204143.25299: entering _queue_task() for managed-node2/set_fact 16380 1727204143.25497: worker is 1 (out of 1 available) 16380 1727204143.25511: exiting _queue_task() for managed-node2/set_fact 16380 1727204143.25524: done queuing things up, now waiting for results queue to drain 16380 1727204143.25527: waiting for pending results... 
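The set_fact task queued here (tests_bridge_nm.yml:13) is the first one in this stretch that actually runs, and its result is fully visible below ("network_provider": "nm"). Reconstructed from the trace:

```yaml
# Confirmed by the ok: result below, which returns
# ansible_facts.network_provider == "nm" for managed-node2.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```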
16380 1727204143.25694: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 16380 1727204143.25755: in run() - task 12b410aa-8751-749c-b6eb-000000000007 16380 1727204143.25775: variable 'ansible_search_path' from source: unknown 16380 1727204143.25807: calling self._execute() 16380 1727204143.25877: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.25888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.25896: variable 'omit' from source: magic vars 16380 1727204143.25986: variable 'omit' from source: magic vars 16380 1727204143.26021: variable 'omit' from source: magic vars 16380 1727204143.26052: variable 'omit' from source: magic vars 16380 1727204143.26091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204143.26129: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204143.26147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204143.26163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204143.26174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204143.26203: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204143.26206: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.26215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.26297: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204143.26305: Set connection var ansible_shell_executable to /bin/sh 16380 1727204143.26314: Set connection var ansible_connection to ssh 16380 1727204143.26323: Set connection var ansible_shell_type to sh 16380 1727204143.26330: Set connection var ansible_pipelining to False 16380 1727204143.26339: Set connection var ansible_timeout to 10 16380 1727204143.26359: variable 'ansible_shell_executable' from source: unknown 16380 1727204143.26363: variable 'ansible_connection' from source: unknown 16380 1727204143.26367: variable 'ansible_module_compression' from source: unknown 16380 1727204143.26370: variable 'ansible_shell_type' from source: unknown 16380 1727204143.26374: variable 'ansible_shell_executable' from source: unknown 16380 1727204143.26377: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.26383: variable 'ansible_pipelining' from source: unknown 16380 1727204143.26386: variable 'ansible_timeout' from source: unknown 16380 1727204143.26392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.26519: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204143.26530: variable 'omit' from source: magic vars 16380 1727204143.26537: starting attempt loop 16380 1727204143.26540: running the handler 16380 1727204143.26554: handler run complete 16380 1727204143.26564: attempt loop complete, returning result 16380 1727204143.26567: _execute() done 16380 1727204143.26570: 
dumping result to json 16380 1727204143.26575: done dumping result, returning 16380 1727204143.26582: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [12b410aa-8751-749c-b6eb-000000000007] 16380 1727204143.26587: sending task result for task 12b410aa-8751-749c-b6eb-000000000007 16380 1727204143.26682: done sending task result for task 12b410aa-8751-749c-b6eb-000000000007 16380 1727204143.26685: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 16380 1727204143.26742: no more pending results, returning what we have 16380 1727204143.26745: results queue empty 16380 1727204143.26746: checking for any_errors_fatal 16380 1727204143.26752: done checking for any_errors_fatal 16380 1727204143.26753: checking for max_fail_percentage 16380 1727204143.26755: done checking for max_fail_percentage 16380 1727204143.26755: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.26756: done checking to see if all hosts have failed 16380 1727204143.26757: getting the remaining hosts for this loop 16380 1727204143.26759: done getting the remaining hosts for this loop 16380 1727204143.26762: getting the next task for host managed-node2 16380 1727204143.26768: done getting next task for host managed-node2 16380 1727204143.26770: ^ task is: TASK: meta (flush_handlers) 16380 1727204143.26772: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204143.26777: getting variables 16380 1727204143.26778: in VariableManager get_vars() 16380 1727204143.26812: Calling all_inventory to load vars for managed-node2 16380 1727204143.26815: Calling groups_inventory to load vars for managed-node2 16380 1727204143.26818: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.26826: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.26829: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.26831: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.26991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.27144: done with get_vars() 16380 1727204143.27151: done getting variables 16380 1727204143.27203: in VariableManager get_vars() 16380 1727204143.27210: Calling all_inventory to load vars for managed-node2 16380 1727204143.27212: Calling groups_inventory to load vars for managed-node2 16380 1727204143.27214: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.27218: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.27220: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.27222: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.27330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.27475: done with get_vars() 16380 1727204143.27486: done queuing things up, now waiting for results queue to drain 16380 1727204143.27488: results queue empty 16380 1727204143.27488: checking for any_errors_fatal 16380 1727204143.27492: done checking for any_errors_fatal 16380 1727204143.27492: checking for 
max_fail_percentage 16380 1727204143.27493: done checking for max_fail_percentage 16380 1727204143.27494: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.27494: done checking to see if all hosts have failed 16380 1727204143.27495: getting the remaining hosts for this loop 16380 1727204143.27496: done getting the remaining hosts for this loop 16380 1727204143.27497: getting the next task for host managed-node2 16380 1727204143.27500: done getting next task for host managed-node2 16380 1727204143.27501: ^ task is: TASK: meta (flush_handlers) 16380 1727204143.27502: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204143.27508: getting variables 16380 1727204143.27510: in VariableManager get_vars() 16380 1727204143.27516: Calling all_inventory to load vars for managed-node2 16380 1727204143.27518: Calling groups_inventory to load vars for managed-node2 16380 1727204143.27519: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.27523: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.27525: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.27527: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.27635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.27937: done with get_vars() 16380 1727204143.27944: done getting variables 16380 1727204143.27977: in VariableManager get_vars() 16380 1727204143.27983: Calling all_inventory to load vars for managed-node2 16380 1727204143.27985: Calling groups_inventory to load vars for managed-node2 16380 1727204143.27987: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.27993: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.27996: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.27998: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.28100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.28244: done with get_vars() 16380 1727204143.28253: done queuing things up, now waiting for results queue to drain 16380 1727204143.28254: results queue empty 16380 1727204143.28255: checking for any_errors_fatal 16380 1727204143.28256: done checking for any_errors_fatal 16380 1727204143.28256: checking for max_fail_percentage 16380 1727204143.28257: done checking for max_fail_percentage 16380 1727204143.28257: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.28258: done checking to see if all hosts have failed 16380 1727204143.28259: getting the remaining hosts for this loop 16380 1727204143.28259: done getting the remaining hosts for this loop 16380 1727204143.28261: getting the next task for host managed-node2 16380 1727204143.28263: done getting next task for host managed-node2 16380 1727204143.28264: ^ task is: None 16380 1727204143.28265: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 16380 1727204143.28266: done queuing things up, now waiting for results queue to drain 16380 1727204143.28266: results queue empty 16380 1727204143.28267: checking for any_errors_fatal 16380 1727204143.28267: done checking for any_errors_fatal 16380 1727204143.28268: checking for max_fail_percentage 16380 1727204143.28269: done checking for max_fail_percentage 16380 1727204143.28269: checking to see if all hosts have failed and the running result is not ok 16380 1727204143.28270: done checking to see if all hosts have failed 16380 1727204143.28271: getting the next task for host managed-node2 16380 1727204143.28273: done getting next task for host managed-node2 16380 1727204143.28273: ^ task is: None 16380 1727204143.28274: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204143.28311: in VariableManager get_vars() 16380 1727204143.28325: done with get_vars() 16380 1727204143.28330: in VariableManager get_vars() 16380 1727204143.28337: done with get_vars() 16380 1727204143.28340: variable 'omit' from source: magic vars 16380 1727204143.28364: in VariableManager get_vars() 16380 1727204143.28371: done with get_vars() 16380 1727204143.28391: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 16380 1727204143.28545: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204143.28569: getting the remaining hosts for this loop 16380 1727204143.28570: done getting the remaining hosts for this loop 16380 1727204143.28572: getting the next task for host managed-node2 16380 1727204143.28574: done getting next task for host managed-node2 16380 1727204143.28575: ^ task is: TASK: Gathering Facts 16380 1727204143.28576: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204143.28578: getting variables 16380 1727204143.28578: in VariableManager get_vars() 16380 1727204143.28584: Calling all_inventory to load vars for managed-node2 16380 1727204143.28586: Calling groups_inventory to load vars for managed-node2 16380 1727204143.28588: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204143.28594: Calling all_plugins_play to load vars for managed-node2 16380 1727204143.28608: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204143.28612: Calling groups_plugins_play to load vars for managed-node2 16380 1727204143.28720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204143.28894: done with get_vars() 16380 1727204143.28900: done getting variables 16380 1727204143.28930: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.036) 0:00:04.396 ***** 16380 1727204143.28947: entering _queue_task() for managed-node2/gather_facts 16380 1727204143.29122: worker is 1 (out of 1 available) 16380 1727204143.29136: exiting _queue_task() for managed-node2/gather_facts 16380 1727204143.29150: done queuing things up, now waiting for results queue to drain 16380 1727204143.29153: waiting for pending results... 
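With the new play started, the linear strategy queues an explicit fact-gathering step (tests_bridge.yml:3). A sketch of the play header implied by the trace; the play name and the Gathering Facts task are confirmed, while `hosts:` is an assumption:

```yaml
# hosts: is assumed; the trace confirms the play name and that facts are
# gathered as the play's first task.
- name: Test configuring bridges
  hosts: all          # assumed target pattern
  gather_facts: true
```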
16380 1727204143.29304: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204143.29369: in run() - task 12b410aa-8751-749c-b6eb-0000000000da 16380 1727204143.29383: variable 'ansible_search_path' from source: unknown 16380 1727204143.29418: calling self._execute() 16380 1727204143.29478: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.29485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.29501: variable 'omit' from source: magic vars 16380 1727204143.29800: variable 'ansible_distribution_major_version' from source: facts 16380 1727204143.29812: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204143.29819: variable 'omit' from source: magic vars 16380 1727204143.29845: variable 'omit' from source: magic vars 16380 1727204143.29874: variable 'omit' from source: magic vars 16380 1727204143.29909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204143.29945: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204143.29962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204143.29979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204143.29992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204143.30021: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204143.30024: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.30027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.30109: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204143.30119: Set connection var ansible_shell_executable to /bin/sh 16380 1727204143.30125: Set connection var ansible_connection to ssh 16380 1727204143.30132: Set connection var ansible_shell_type to sh 16380 1727204143.30139: Set connection var ansible_pipelining to False 16380 1727204143.30152: Set connection var ansible_timeout to 10 16380 1727204143.30171: variable 'ansible_shell_executable' from source: unknown 16380 1727204143.30174: variable 'ansible_connection' from source: unknown 16380 1727204143.30177: variable 'ansible_module_compression' from source: unknown 16380 1727204143.30180: variable 'ansible_shell_type' from source: unknown 16380 1727204143.30185: variable 'ansible_shell_executable' from source: unknown 16380 1727204143.30191: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204143.30196: variable 'ansible_pipelining' from source: unknown 16380 1727204143.30199: variable 'ansible_timeout' from source: unknown 16380 1727204143.30205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204143.30357: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204143.30367: variable 'omit' from source: magic vars 16380 1727204143.30378: starting attempt loop 16380 1727204143.30381: running the 
handler 16380 1727204143.30393: variable 'ansible_facts' from source: unknown 16380 1727204143.30412: _low_level_execute_command(): starting 16380 1727204143.30421: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204143.30963: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204143.30969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204143.30972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204143.31030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204143.31033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204143.31117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204143.33681: stdout chunk (state=3): >>>/root <<< 16380 1727204143.33842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204143.33896: stderr chunk (state=3): >>><<< 16380 1727204143.33900: stdout chunk (state=3): >>><<< 16380 1727204143.33925: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204143.33941: _low_level_execute_command(): starting 16380 1727204143.33949: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955 `" && echo ansible-tmp-1727204143.3392515-16673-149686081596955="` echo 
/root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955 `" ) && sleep 0' 16380 1727204143.34413: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204143.34417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204143.34419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204143.34429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204143.34468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204143.34476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204143.34524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204143.37444: stdout chunk (state=3): >>>ansible-tmp-1727204143.3392515-16673-149686081596955=/root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955 <<< 16380 1727204143.37899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204143.37905: stdout chunk (state=3): >>><<< 16380 1727204143.37908: stderr chunk (state=3): >>><<< 16380 1727204143.37911: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204143.3392515-16673-149686081596955=/root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204143.37914: variable 'ansible_module_compression' from source: unknown 16380 1727204143.37917: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204143.38066: 
variable 'ansible_facts' from source: unknown 16380 1727204143.38285: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/AnsiballZ_setup.py 16380 1727204143.38506: Sending initial data 16380 1727204143.38510: Sent initial data (154 bytes) 16380 1727204143.39594: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204143.39598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204143.39643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204143.39688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204143.42056: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204143.42129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204143.42178: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpxhzzbnpc /root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/AnsiballZ_setup.py <<< 16380 1727204143.42181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/AnsiballZ_setup.py" <<< 16380 1727204143.42185: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpxhzzbnpc" to remote "/root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/AnsiballZ_setup.py" <<< 16380 1727204143.45740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204143.45745: stdout chunk (state=3): >>><<< 16380 1727204143.45747: stderr chunk (state=3): >>><<< 16380 1727204143.45750: done transferring module to remote 16380 1727204143.45752: _low_level_execute_command(): starting 16380 1727204143.45755: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/ /root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/AnsiballZ_setup.py && sleep 0' 16380 1727204143.46333: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204143.46351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204143.46367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204143.46392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204143.46417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204143.46515: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204143.46543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204143.46568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204143.46586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204143.46679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204143.49518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204143.49542: stdout chunk (state=3): >>><<< 16380 1727204143.49568: stderr chunk (state=3): >>><<< 16380 1727204143.49594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204143.49606: _low_level_execute_command(): starting 16380 1727204143.49620: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/AnsiballZ_setup.py && sleep 0' 16380 1727204143.50380: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204143.50399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204143.50420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204143.50479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204143.50560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204143.50600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204143.50621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204143.50720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204144.48074: stdout chunk (state=3): >>> <<< 16380 1727204144.48079: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "44", "epoch": "1727204144", "epoch_int": "1727204144", "date": "2024-09-24", "time": "14:55:44", "iso8601_micro": "2024-09-24T18:55:44.009745Z", "iso8601": "2024-09-24T18:55:44Z", "iso8601_basic": "20240924T145544009745", "iso8601_basic_short": "20240924T145544", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": 
"/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVa<<< 16380 1727204144.48101: stdout chunk (state=3): >>>OkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_hostnqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "manage<<< 16380 1727204144.48228: stdout chunk (state=3): >>>d-node2", "ansible_hostname": 
"managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_loadavg": {"1m": 0.51513671875, "5m": 0.541015625, "15m": 0.34033203125}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2823, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 894, "free": 2823}, "nocache": {"free": 3450, "used": 267}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 648, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251155963904, "block_size": 4096, "block_total": 64479564, "block_available": 61317374, "block_used": 3162190, "inode_total": 16384000, "inode_available": 16302247, "inode_used": 81753, "uuid": "97924df9-0e<<< 16380 1727204144.48316: stdout chunk (state=3): >>>6a-4a28-b439-92c447b04700"}], "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204144.51601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204144.51919: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204144.51922: stdout chunk (state=3): >>><<< 16380 1727204144.51925: stderr chunk (state=3): >>><<< 16380 1727204144.51930: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "44", "epoch": "1727204144", "epoch_int": "1727204144", "date": "2024-09-24", "time": "14:55:44", "iso8601_micro": "2024-09-24T18:55:44.009745Z", "iso8601": "2024-09-24T18:55:44Z", "iso8601_basic": "20240924T145544009745", "iso8601_basic_short": "20240924T145544", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_hostnqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_loadavg": {"1m": 0.51513671875, "5m": 0.541015625, "15m": 0.34033203125}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2823, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 894, "free": 2823}, "nocache": {"free": 3450, "used": 267}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": 
[]}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 648, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251155963904, "block_size": 4096, "block_total": 64479564, "block_available": 61317374, "block_used": 3162190, "inode_total": 16384000, "inode_available": 16302247, "inode_used": 81753, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204144.53013: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204144.53277: _low_level_execute_command(): starting 16380 1727204144.53281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204143.3392515-16673-149686081596955/ > /dev/null 2>&1 && sleep 0' 16380 1727204144.54198: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204144.54215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204144.54423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204144.54485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204144.54502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204144.54517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204144.54637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204144.54717: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 16380 1727204144.57693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204144.57748: stderr chunk (state=3): >>><<< 16380 1727204144.57752: stdout chunk (state=3): >>><<< 16380 1727204144.57769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204144.57796: handler run complete 16380 1727204144.58512: variable 'ansible_facts' from source: unknown 16380 1727204144.59042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204144.60145: variable 'ansible_facts' from source: unknown 16380 1727204144.60488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204144.60957: attempt loop complete, returning result 16380 1727204144.60968: _execute() done 16380 1727204144.61043: dumping result to json 16380 1727204144.61067: done dumping result, returning 16380 1727204144.61151: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-0000000000da] 16380 1727204144.61154: sending task result for task 12b410aa-8751-749c-b6eb-0000000000da 16380 1727204144.61827: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000da ok: [managed-node2] 16380 1727204144.63050: WORKER PROCESS EXITING 16380 1727204144.63317: no more pending results, returning what we have 16380 1727204144.63321: results queue empty 16380 1727204144.63322: checking for any_errors_fatal 16380 1727204144.63324: done checking for any_errors_fatal 16380 1727204144.63325: checking for max_fail_percentage 16380 1727204144.63327: done checking for max_fail_percentage 16380 1727204144.63328: checking to see if all hosts have failed and the running result is not ok 16380 1727204144.63329: done checking to see if all hosts have failed 16380 1727204144.63330: getting the remaining hosts for this loop 16380 1727204144.63332: done getting the remaining hosts for this loop 16380 1727204144.63337: getting the next task for host managed-node2 16380 1727204144.63343: done getting next task for host managed-node2 16380 1727204144.63345: ^ task is: TASK: meta (flush_handlers) 16380 1727204144.63347: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204144.63351: getting variables 16380 1727204144.63352: in VariableManager get_vars() 16380 1727204144.63377: Calling all_inventory to load vars for managed-node2 16380 1727204144.63381: Calling groups_inventory to load vars for managed-node2 16380 1727204144.63385: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204144.63504: Calling all_plugins_play to load vars for managed-node2 16380 1727204144.63509: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204144.63513: Calling groups_plugins_play to load vars for managed-node2 16380 1727204144.63911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204144.64560: done with get_vars() 16380 1727204144.64652: done getting variables 16380 1727204144.64799: in VariableManager get_vars() 16380 1727204144.64811: Calling all_inventory to load vars for managed-node2 16380 1727204144.64814: Calling groups_inventory to load vars for managed-node2 16380 1727204144.64817: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204144.64823: Calling all_plugins_play to load vars for managed-node2 16380 1727204144.64826: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204144.64830: Calling groups_plugins_play to load vars for managed-node2 16380 1727204144.65251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204144.65868: done with get_vars() 16380 1727204144.65884: done queuing things up, now waiting for results queue to drain 16380 1727204144.65887: results queue empty 16380 1727204144.65888: checking for any_errors_fatal 16380 1727204144.65951: done checking for any_errors_fatal 16380 1727204144.65952: checking for max_fail_percentage 16380 1727204144.65954: done checking for max_fail_percentage 16380 1727204144.65955: checking to see if all hosts have failed and the running result is not ok 16380 1727204144.65961: done checking to see if all hosts have failed 16380 1727204144.65962: getting the remaining hosts for this loop 16380 1727204144.65963: done getting the remaining hosts for this loop 16380 1727204144.65966: getting the next task for host managed-node2 16380 1727204144.65971: done getting next task for host managed-node2 16380 1727204144.65973: ^ task is: TASK: Set interface={{ interface }} 16380 1727204144.65975: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
16380 1727204144.65978: getting variables
16380 1727204144.65979: in VariableManager get_vars()
16380 1727204144.65991: Calling all_inventory to load vars for managed-node2
16380 1727204144.65993: Calling groups_inventory to load vars for managed-node2
16380 1727204144.65996: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204144.66002: Calling all_plugins_play to load vars for managed-node2
16380 1727204144.66005: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204144.66009: Calling groups_plugins_play to load vars for managed-node2
16380 1727204144.66425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204144.67026: done with get_vars()
16380 1727204144.67071: done getting variables
16380 1727204144.67168: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
16380 1727204144.67424: variable 'interface' from source: play vars

TASK [Set interface=LSR-TST-br31] **********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9
Tuesday 24 September 2024 14:55:44 -0400 (0:00:01.386) 0:00:05.782 *****
16380 1727204144.67613: entering _queue_task() for managed-node2/set_fact
16380 1727204144.68354: worker is 1 (out of 1 available)
16380 1727204144.68368: exiting _queue_task() for managed-node2/set_fact
16380 1727204144.68382: done queuing things up, now waiting for results queue to drain
16380 1727204144.68384: waiting for pending results...
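For orientation, the task the executor is about to run at tests_bridge.yml:9 can be reconstructed from the banner and the templated variables above. This is a hedged sketch, not the playbook source: only the task name and the resulting fact are confirmed by this log, and the module arguments are an assumption inferred from the 'interface' play var being templated twice during _execute().

    # Hypothetical reconstruction of tests_bridge.yml:9
    - name: Set interface={{ interface }}
      ansible.builtin.set_fact:
        interface: "{{ interface }}"   # play var; the result below shows the value LSR-TST-br31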
16380 1727204144.68907: running TaskExecutor() for managed-node2/TASK: Set interface=LSR-TST-br31 16380 1727204144.69216: in run() - task 12b410aa-8751-749c-b6eb-00000000000b 16380 1727204144.69221: variable 'ansible_search_path' from source: unknown 16380 1727204144.69253: calling self._execute() 16380 1727204144.69350: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204144.69433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204144.69475: variable 'omit' from source: magic vars 16380 1727204144.70495: variable 'ansible_distribution_major_version' from source: facts 16380 1727204144.70614: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204144.70618: variable 'omit' from source: magic vars 16380 1727204144.70621: variable 'omit' from source: magic vars 16380 1727204144.70623: variable 'interface' from source: play vars 16380 1727204144.70841: variable 'interface' from source: play vars 16380 1727204144.70867: variable 'omit' from source: magic vars 16380 1727204144.70987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204144.71037: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204144.71351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204144.71354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204144.71460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204144.71464: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204144.71467: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204144.71469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204144.71713: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204144.71728: Set connection var ansible_shell_executable to /bin/sh 16380 1727204144.71741: Set connection var ansible_connection to ssh 16380 1727204144.71800: Set connection var ansible_shell_type to sh 16380 1727204144.71814: Set connection var ansible_pipelining to False 16380 1727204144.71829: Set connection var ansible_timeout to 10 16380 1727204144.71860: variable 'ansible_shell_executable' from source: unknown 16380 1727204144.71902: variable 'ansible_connection' from source: unknown 16380 1727204144.72095: variable 'ansible_module_compression' from source: unknown 16380 1727204144.72098: variable 'ansible_shell_type' from source: unknown 16380 1727204144.72101: variable 'ansible_shell_executable' from source: unknown 16380 1727204144.72103: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204144.72105: variable 'ansible_pipelining' from source: unknown 16380 1727204144.72107: variable 'ansible_timeout' from source: unknown 16380 1727204144.72111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204144.72352: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 
1727204144.72495: variable 'omit' from source: magic vars
16380 1727204144.72499: starting attempt loop
16380 1727204144.72501: running the handler
16380 1727204144.72504: handler run complete
16380 1727204144.72506: attempt loop complete, returning result
16380 1727204144.72513: _execute() done
16380 1727204144.72521: dumping result to json
16380 1727204144.72529: done dumping result, returning
16380 1727204144.72540: done running TaskExecutor() for managed-node2/TASK: Set interface=LSR-TST-br31 [12b410aa-8751-749c-b6eb-00000000000b]
16380 1727204144.72660: sending task result for task 12b410aa-8751-749c-b6eb-00000000000b
16380 1727204144.72732: done sending task result for task 12b410aa-8751-749c-b6eb-00000000000b
16380 1727204144.72736: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "ansible_facts": {
        "interface": "LSR-TST-br31"
    },
    "changed": false
}
16380 1727204144.72825: no more pending results, returning what we have
16380 1727204144.72829: results queue empty
16380 1727204144.72830: checking for any_errors_fatal
16380 1727204144.72832: done checking for any_errors_fatal
16380 1727204144.72833: checking for max_fail_percentage
16380 1727204144.72835: done checking for max_fail_percentage
16380 1727204144.72836: checking to see if all hosts have failed and the running result is not ok
16380 1727204144.72837: done checking to see if all hosts have failed
16380 1727204144.72838: getting the remaining hosts for this loop
16380 1727204144.72840: done getting the remaining hosts for this loop
16380 1727204144.72845: getting the next task for host managed-node2
16380 1727204144.72852: done getting next task for host managed-node2
16380 1727204144.72855: ^ task is: TASK: Include the task 'show_interfaces.yml'
16380 1727204144.72857: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204144.72863: getting variables
16380 1727204144.72865: in VariableManager get_vars()
16380 1727204144.72903: Calling all_inventory to load vars for managed-node2
16380 1727204144.72907: Calling groups_inventory to load vars for managed-node2
16380 1727204144.72912: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204144.72925: Calling all_plugins_play to load vars for managed-node2
16380 1727204144.72928: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204144.72932: Calling groups_plugins_play to load vars for managed-node2
16380 1727204144.73894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204144.74579: done with get_vars()
16380 1727204144.74994: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12
Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.074) 0:00:05.857 *****
16380 1727204144.75097: entering _queue_task() for managed-node2/include_tasks
16380 1727204144.75681: worker is 1 (out of 1 available)
16380 1727204144.76010: exiting _queue_task() for managed-node2/include_tasks
16380 1727204144.76023: done queuing things up, now waiting for results queue to drain
16380 1727204144.76025: waiting for pending results...
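The task just queued is an include. A minimal sketch of what tests_bridge.yml:12 likely looks like, assuming the tasks/ relative path that matches where the log later loads the file (playbooks/tasks/show_interfaces.yml); the exact module spelling in the source file is not shown in this log:

    # Hypothetical reconstruction of tests_bridge.yml:12
    - name: Include the task 'show_interfaces.yml'
      ansible.builtin.include_tasks: tasks/show_interfaces.yml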
16380 1727204144.76520: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 16380 1727204144.76997: in run() - task 12b410aa-8751-749c-b6eb-00000000000c 16380 1727204144.77001: variable 'ansible_search_path' from source: unknown 16380 1727204144.77092: calling self._execute() 16380 1727204144.77430: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204144.77441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204144.77450: variable 'omit' from source: magic vars 16380 1727204144.78544: variable 'ansible_distribution_major_version' from source: facts 16380 1727204144.78763: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204144.78766: _execute() done 16380 1727204144.78769: dumping result to json 16380 1727204144.78771: done dumping result, returning 16380 1727204144.78774: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-749c-b6eb-00000000000c] 16380 1727204144.78777: sending task result for task 12b410aa-8751-749c-b6eb-00000000000c 16380 1727204144.78852: done sending task result for task 12b410aa-8751-749c-b6eb-00000000000c 16380 1727204144.78854: WORKER PROCESS EXITING 16380 1727204144.78926: no more pending results, returning what we have 16380 1727204144.78933: in VariableManager get_vars() 16380 1727204144.78971: Calling all_inventory to load vars for managed-node2 16380 1727204144.78975: Calling groups_inventory to load vars for managed-node2 16380 1727204144.78979: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204144.78998: Calling all_plugins_play to load vars for managed-node2 16380 1727204144.79001: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204144.79004: Calling groups_plugins_play to load vars for managed-node2 16380 1727204144.79621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204144.80700: done with get_vars() 16380 1727204144.80711: variable 'ansible_search_path' from source: unknown 16380 1727204144.80732: we have included files to process 16380 1727204144.80734: generating all_blocks data 16380 1727204144.80736: done generating all_blocks data 16380 1727204144.80737: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 16380 1727204144.80738: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 16380 1727204144.80742: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 16380 1727204144.81369: in VariableManager get_vars() 16380 1727204144.81617: done with get_vars() 16380 1727204144.82063: done processing included file 16380 1727204144.82066: iterating over new_blocks loaded from include file 16380 1727204144.82067: in VariableManager get_vars() 16380 1727204144.82197: done with get_vars() 16380 1727204144.82201: filtering new block on tags 16380 1727204144.82224: done filtering new block on tags 16380 1727204144.82227: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 16380 1727204144.82233: extending task lists for all hosts with included blocks 16380 
1727204144.82546: done extending task lists
16380 1727204144.82548: done processing included files
16380 1727204144.82549: results queue empty
16380 1727204144.82550: checking for any_errors_fatal
16380 1727204144.82555: done checking for any_errors_fatal
16380 1727204144.82556: checking for max_fail_percentage
16380 1727204144.82557: done checking for max_fail_percentage
16380 1727204144.82558: checking to see if all hosts have failed and the running result is not ok
16380 1727204144.82559: done checking to see if all hosts have failed
16380 1727204144.82560: getting the remaining hosts for this loop
16380 1727204144.82562: done getting the remaining hosts for this loop
16380 1727204144.82565: getting the next task for host managed-node2
16380 1727204144.82683: done getting next task for host managed-node2
16380 1727204144.82686: ^ task is: TASK: Include the task 'get_current_interfaces.yml'
16380 1727204144.82691: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204144.82694: getting variables
16380 1727204144.82695: in VariableManager get_vars()
16380 1727204144.82705: Calling all_inventory to load vars for managed-node2
16380 1727204144.82708: Calling groups_inventory to load vars for managed-node2
16380 1727204144.82711: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204144.82717: Calling all_plugins_play to load vars for managed-node2
16380 1727204144.82720: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204144.82724: Calling groups_plugins_play to load vars for managed-node2
16380 1727204144.83567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204144.84450: done with get_vars()
16380 1727204144.84461: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.096) 0:00:05.954 *****
16380 1727204144.84753: entering _queue_task() for managed-node2/include_tasks
16380 1727204144.87428: worker is 1 (out of 1 available)
16380 1727204144.87441: exiting _queue_task() for managed-node2/include_tasks
16380 1727204144.87453: done queuing things up, now waiting for results queue to drain
16380 1727204144.87571: waiting for pending results...
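Since show_interfaces.yml:3 is itself an include, its first task can be sketched the same way; the bare relative path is an assumption based on include_tasks resolving relative to the including file, with both files living in playbooks/tasks/:

    # Hypothetical reconstruction of tasks/show_interfaces.yml, line 3
    - name: Include the task 'get_current_interfaces.yml'
      ansible.builtin.include_tasks: get_current_interfaces.yml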
16380 1727204144.88024: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 16380 1727204144.88445: in run() - task 12b410aa-8751-749c-b6eb-0000000000ee 16380 1727204144.88450: variable 'ansible_search_path' from source: unknown 16380 1727204144.88454: variable 'ansible_search_path' from source: unknown 16380 1727204144.88456: calling self._execute() 16380 1727204144.88460: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204144.88557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204144.88576: variable 'omit' from source: magic vars 16380 1727204144.89601: variable 'ansible_distribution_major_version' from source: facts 16380 1727204144.89624: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204144.89639: _execute() done 16380 1727204144.89653: dumping result to json 16380 1727204144.89757: done dumping result, returning 16380 1727204144.89760: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-749c-b6eb-0000000000ee] 16380 1727204144.89763: sending task result for task 12b410aa-8751-749c-b6eb-0000000000ee 16380 1727204144.89925: no more pending results, returning what we have 16380 1727204144.89931: in VariableManager get_vars() 16380 1727204144.89973: Calling all_inventory to load vars for managed-node2 16380 1727204144.89977: Calling groups_inventory to load vars for managed-node2 16380 1727204144.89982: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204144.89999: Calling all_plugins_play to load vars for managed-node2 16380 1727204144.90003: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204144.90007: Calling groups_plugins_play to load vars for managed-node2 16380 1727204144.90562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204144.91135: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000ee 16380 1727204144.91139: WORKER PROCESS EXITING 16380 1727204144.91371: done with get_vars() 16380 1727204144.91380: variable 'ansible_search_path' from source: unknown 16380 1727204144.91382: variable 'ansible_search_path' from source: unknown 16380 1727204144.91429: we have included files to process 16380 1727204144.91431: generating all_blocks data 16380 1727204144.91433: done generating all_blocks data 16380 1727204144.91434: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 16380 1727204144.91436: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 16380 1727204144.91438: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 16380 1727204144.92402: done processing included file 16380 1727204144.92404: iterating over new_blocks loaded from include file 16380 1727204144.92406: in VariableManager get_vars() 16380 1727204144.92424: done with get_vars() 16380 1727204144.92426: filtering new block on tags 16380 1727204144.92517: done filtering new block on tags 16380 1727204144.92520: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 16380 1727204144.92526: extending task lists for all hosts with included blocks 16380 1727204144.92899: done extending task lists 16380 1727204144.92900: done processing included files 16380 1727204144.92901: results queue empty 16380 1727204144.92902: checking for any_errors_fatal 16380 1727204144.92906: done checking for any_errors_fatal 16380 1727204144.92907: checking for max_fail_percentage 16380 1727204144.92908: done checking for max_fail_percentage 16380 1727204144.92911: checking to see if all hosts have failed and the running result is not ok 16380 1727204144.92912: done checking to see if all hosts have failed 16380 1727204144.92913: getting the remaining hosts for this loop 16380 1727204144.92915: done getting the remaining hosts for this loop 16380 1727204144.92918: getting the next task for host managed-node2 16380 1727204144.92922: done getting next task for host managed-node2 16380 1727204144.92925: ^ task is: TASK: Gather current interface info 16380 1727204144.92929: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204144.92931: getting variables 16380 1727204144.92933: in VariableManager get_vars() 16380 1727204144.92942: Calling all_inventory to load vars for managed-node2 16380 1727204144.92945: Calling groups_inventory to load vars for managed-node2 16380 1727204144.92948: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204144.92954: Calling all_plugins_play to load vars for managed-node2 16380 1727204144.92957: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204144.92961: Calling groups_plugins_play to load vars for managed-node2 16380 1727204144.93379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204144.94121: done with get_vars() 16380 1727204144.94131: done getting variables 16380 1727204144.94176: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.094) 0:00:06.048 ***** 16380 1727204144.94215: entering _queue_task() for managed-node2/command 16380 1727204144.94687: worker is 1 (out of 1 available) 16380 1727204144.94912: exiting _queue_task() for managed-node2/command 16380 1727204144.94924: done queuing things up, now waiting for results queue to drain 16380 1727204144.94926: waiting for pending results... 
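[Annotation] The task just queued runs the command module; its exact arguments are visible later in the log ("chdir": "/sys/class/net", "_raw_params": "ls -1"), and the result is registered as _current_interfaces (it later appears as "variable '_current_interfaces' from source: set_fact"). A hedged sketch of what get_current_interfaces.yml likely contains; the stdout_lines mapping and the changed_when are inferences (the displayed result shows "changed": false even though the raw module result says true, which suggests a changed_when override):

    # get_current_interfaces.yml (hedged reconstruction)
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces
      changed_when: false

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"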
16380 1727204144.95198: running TaskExecutor() for managed-node2/TASK: Gather current interface info 16380 1727204144.95533: in run() - task 12b410aa-8751-749c-b6eb-0000000000fd 16380 1727204144.95580: variable 'ansible_search_path' from source: unknown 16380 1727204144.95593: variable 'ansible_search_path' from source: unknown 16380 1727204144.95897: calling self._execute() 16380 1727204144.95924: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204144.95937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204144.95954: variable 'omit' from source: magic vars 16380 1727204144.96875: variable 'ansible_distribution_major_version' from source: facts 16380 1727204144.96971: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204144.96985: variable 'omit' from source: magic vars 16380 1727204144.97049: variable 'omit' from source: magic vars 16380 1727204144.97142: variable 'omit' from source: magic vars 16380 1727204144.97344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204144.97502: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204144.97936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204144.97940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204144.97943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204144.97946: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204144.98044: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204144.98048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204144.98140: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204144.98383: Set connection var ansible_shell_executable to /bin/sh 16380 1727204144.98399: Set connection var ansible_connection to ssh 16380 1727204144.98696: Set connection var ansible_shell_type to sh 16380 1727204144.98700: Set connection var ansible_pipelining to False 16380 1727204144.98702: Set connection var ansible_timeout to 10 16380 1727204144.98705: variable 'ansible_shell_executable' from source: unknown 16380 1727204144.98707: variable 'ansible_connection' from source: unknown 16380 1727204144.98712: variable 'ansible_module_compression' from source: unknown 16380 1727204144.98715: variable 'ansible_shell_type' from source: unknown 16380 1727204144.98717: variable 'ansible_shell_executable' from source: unknown 16380 1727204144.98719: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204144.98721: variable 'ansible_pipelining' from source: unknown 16380 1727204144.98723: variable 'ansible_timeout' from source: unknown 16380 1727204144.98725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204144.99235: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204144.99313: variable 'omit' from source: magic vars 16380 
1727204144.99327: starting attempt loop 16380 1727204144.99504: running the handler 16380 1727204144.99508: _low_level_execute_command(): starting 16380 1727204144.99513: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204145.01522: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204145.01715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204145.04206: stdout chunk (state=3): >>>/root <<< 16380 1727204145.04293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204145.04367: stderr chunk (state=3): >>><<< 16380 1727204145.04430: stdout chunk (state=3): >>><<< 16380 1727204145.04465: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204145.04550: _low_level_execute_command(): starting 16380 1727204145.04564: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205 `" && echo ansible-tmp-1727204145.0453393-16811-125139641519205="` echo /root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205 `" ) && sleep 0' 16380 1727204145.06035: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204145.06049: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204145.06122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204145.06452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204145.09193: stdout chunk (state=3): >>>ansible-tmp-1727204145.0453393-16811-125139641519205=/root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205 <<< 16380 1727204145.09536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204145.09614: stderr chunk (state=3): >>><<< 16380 1727204145.09626: stdout chunk (state=3): >>><<< 16380 1727204145.09652: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204145.0453393-16811-125139641519205=/root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204145.09695: variable 'ansible_module_compression' from source: unknown 16380 1727204145.09860: ANSIBALLZ: Using generic lock for ansible.legacy.command 16380 1727204145.10196: ANSIBALLZ: Acquiring lock 16380 1727204145.10200: ANSIBALLZ: Lock acquired: 140602939598528 16380 1727204145.10203: ANSIBALLZ: Creating module 16380 1727204145.50969: ANSIBALLZ: Writing module into payload 16380 1727204145.51298: ANSIBALLZ: Writing module 16380 1727204145.51335: ANSIBALLZ: Renaming module 16380 1727204145.51349: ANSIBALLZ: Done creating module 16380 1727204145.51378: variable 'ansible_facts' from source: unknown 16380 1727204145.51703: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/AnsiballZ_command.py 16380 1727204145.52022: Sending initial data 16380 1727204145.52032: Sent initial data (156 bytes) 16380 1727204145.52644: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204145.52715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204145.52719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204145.52821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204145.52903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204145.54681: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204145.54712: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204145.54915: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpex41wxrt /root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/AnsiballZ_command.py <<< 16380 1727204145.54924: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/AnsiballZ_command.py" <<< 16380 1727204145.54927: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpex41wxrt" to remote "/root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/AnsiballZ_command.py" <<< 16380 1727204145.58139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204145.58266: stderr chunk (state=3): >>><<< 16380 1727204145.58270: stdout chunk (state=3): >>><<< 16380 1727204145.58272: done transferring module to remote 16380 1727204145.58479: _low_level_execute_command(): starting 16380 1727204145.58483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/ /root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/AnsiballZ_command.py && sleep 0' 16380 1727204145.60098: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204145.60391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204145.60451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204145.60534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204145.60621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204145.62832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204145.62843: stdout chunk (state=3): >>><<< 16380 1727204145.62935: stderr chunk (state=3): >>><<< 16380 1727204145.62944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204145.62957: _low_level_execute_command(): starting 16380 1727204145.62969: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/AnsiballZ_command.py && sleep 0' 16380 1727204145.64307: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204145.64560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204145.64579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204145.64601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204145.64716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204145.83968: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:45.833036", "end": "2024-09-24 14:55:45.836746", "delta": "0:00:00.003710", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16380 1727204145.85746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204145.85947: stderr chunk (state=3): >>><<< 16380 1727204145.85964: stdout chunk (state=3): >>><<< 16380 1727204145.86025: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:45.833036", "end": "2024-09-24 14:55:45.836746", "delta": "0:00:00.003710", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
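[Annotation] The JSON blob on stdout above is the module's raw return value; register makes it available to later tasks, with convenience fields such as stdout_lines derived by Ansible from stdout. An illustrative excerpt of the registered result as later tasks see it (a sketch, not a literal log artifact):

    # Shape of the registered variable (illustrative; stdout_lines is
    # derived from stdout and is not present in the raw module JSON)
    _current_interfaces:
      rc: 0
      stdout: "bonding_masters\neth0\nlo"
      stdout_lines:
        - bonding_masters
        - eth0
        - lo

The whole put/chmod/python/rm round-trip above happens because pipelining is disabled for this run ("Set connection var ansible_pipelining to False"); with pipelining enabled, most modules would be streamed to the remote Python over stdin instead of being written to a remote temp directory.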
16380 1727204145.86453: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204145.86501: _low_level_execute_command(): starting 16380 1727204145.86540: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204145.0453393-16811-125139641519205/ > /dev/null 2>&1 && sleep 0' 16380 1727204145.88218: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204145.88250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204145.88258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16380 1727204145.88267: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204145.88514: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204145.88522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204145.88525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204145.88661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204145.90621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204145.90811: stderr chunk (state=3): >>><<< 16380 1727204145.90819: stdout chunk (state=3): >>><<< 16380 1727204145.90842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204145.90856: handler run complete
16380 1727204145.90882: Evaluated conditional (False): False
16380 1727204145.91003: attempt loop complete, returning result
16380 1727204145.91007: _execute() done
16380 1727204145.91018: dumping result to json
16380 1727204145.91052: done dumping result, returning
16380 1727204145.91064: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-749c-b6eb-0000000000fd]
16380 1727204145.91072: sending task result for task 12b410aa-8751-749c-b6eb-0000000000fd
16380 1727204145.91371: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000fd
16380 1727204145.91374: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003710",
    "end": "2024-09-24 14:55:45.836746",
    "rc": 0,
    "start": "2024-09-24 14:55:45.833036"
}

STDOUT:

bonding_masters
eth0
lo

16380 1727204145.91526: no more pending results, returning what we have
16380 1727204145.91530: results queue empty
16380 1727204145.91534: checking for any_errors_fatal
16380 1727204145.91536: done checking for any_errors_fatal
16380 1727204145.91537: checking for max_fail_percentage
16380 1727204145.91539: done checking for max_fail_percentage
16380 1727204145.91540: checking to see if all hosts have failed and the running result is not ok
16380 1727204145.91541: done checking to see if all hosts have failed
16380 1727204145.91542: getting the remaining hosts for this loop
16380 1727204145.91545: done getting the remaining hosts for this loop
16380 1727204145.91549: getting the next task for host managed-node2
16380 1727204145.91557: done getting next task for host managed-node2
16380 1727204145.91560: ^ task is: TASK: Set current_interfaces
16380 1727204145.91565: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 16380 1727204145.91569: getting variables 16380 1727204145.91571: in VariableManager get_vars() 16380 1727204145.91745: Calling all_inventory to load vars for managed-node2 16380 1727204145.91748: Calling groups_inventory to load vars for managed-node2 16380 1727204145.91752: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204145.91764: Calling all_plugins_play to load vars for managed-node2 16380 1727204145.91767: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204145.91771: Calling groups_plugins_play to load vars for managed-node2 16380 1727204145.92306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204145.92874: done with get_vars() 16380 1727204145.92886: done getting variables 16380 1727204145.93016: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.988) 0:00:07.037 ***** 16380 1727204145.93106: entering _queue_task() for managed-node2/set_fact 16380 1727204145.93549: worker is 1 (out of 1 available) 16380 1727204145.93563: exiting _queue_task() for managed-node2/set_fact 16380 1727204145.93577: done queuing things up, now waiting for results queue to drain 16380 1727204145.93579: waiting for pending results... 
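[Annotation] The repeated "Set connection var ..." entries that bracket each task (seen above for the command task and again below) show the per-task connection settings being pinned: ssh connection, sh shell, pipelining off, a 10-second timeout. These are standard Ansible connection variables; a hedged illustration of inventory host vars that would produce the same settings (this run's actual inventory is not shown in this section):

    # Illustrative host vars (an assumption, not this run's inventory file)
    managed-node2:
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_pipelining: false
      ansible_timeout: 10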
16380 1727204145.94147: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 16380 1727204145.94294: in run() - task 12b410aa-8751-749c-b6eb-0000000000fe 16380 1727204145.94299: variable 'ansible_search_path' from source: unknown 16380 1727204145.94301: variable 'ansible_search_path' from source: unknown 16380 1727204145.94341: calling self._execute() 16380 1727204145.94506: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204145.94525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204145.94535: variable 'omit' from source: magic vars 16380 1727204145.95636: variable 'ansible_distribution_major_version' from source: facts 16380 1727204145.95665: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204145.95678: variable 'omit' from source: magic vars 16380 1727204145.95773: variable 'omit' from source: magic vars 16380 1727204145.95967: variable '_current_interfaces' from source: set_fact 16380 1727204145.96078: variable 'omit' from source: magic vars 16380 1727204145.96136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204145.96313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204145.96469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204145.96473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204145.96475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204145.96478: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204145.96481: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204145.96516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204145.96897: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204145.96901: Set connection var ansible_shell_executable to /bin/sh 16380 1727204145.96904: Set connection var ansible_connection to ssh 16380 1727204145.96906: Set connection var ansible_shell_type to sh 16380 1727204145.96914: Set connection var ansible_pipelining to False 16380 1727204145.96917: Set connection var ansible_timeout to 10 16380 1727204145.96919: variable 'ansible_shell_executable' from source: unknown 16380 1727204145.96922: variable 'ansible_connection' from source: unknown 16380 1727204145.96926: variable 'ansible_module_compression' from source: unknown 16380 1727204145.96930: variable 'ansible_shell_type' from source: unknown 16380 1727204145.96932: variable 'ansible_shell_executable' from source: unknown 16380 1727204145.96934: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204145.96936: variable 'ansible_pipelining' from source: unknown 16380 1727204145.96939: variable 'ansible_timeout' from source: unknown 16380 1727204145.96941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204145.97179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False)
16380 1727204145.97199: variable 'omit' from source: magic vars
16380 1727204145.97214: starting attempt loop
16380 1727204145.97222: running the handler
16380 1727204145.97239: handler run complete
16380 1727204145.97257: attempt loop complete, returning result
16380 1727204145.97264: _execute() done
16380 1727204145.97276: dumping result to json
16380 1727204145.97285: done dumping result, returning
16380 1727204145.97299: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-749c-b6eb-0000000000fe]
16380 1727204145.97337: sending task result for task 12b410aa-8751-749c-b6eb-0000000000fe
ok: [managed-node2] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
16380 1727204145.97608: no more pending results, returning what we have
16380 1727204145.97613: results queue empty
16380 1727204145.97614: checking for any_errors_fatal
16380 1727204145.97624: done checking for any_errors_fatal
16380 1727204145.97625: checking for max_fail_percentage
16380 1727204145.97627: done checking for max_fail_percentage
16380 1727204145.97628: checking to see if all hosts have failed and the running result is not ok
16380 1727204145.97629: done checking to see if all hosts have failed
16380 1727204145.97630: getting the remaining hosts for this loop
16380 1727204145.97632: done getting the remaining hosts for this loop
16380 1727204145.97636: getting the next task for host managed-node2
16380 1727204145.97645: done getting next task for host managed-node2
16380 1727204145.97649: ^ task is: TASK: Show current_interfaces
16380 1727204145.97652: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 16380 1727204145.97657: getting variables 16380 1727204145.97659: in VariableManager get_vars() 16380 1727204145.97694: Calling all_inventory to load vars for managed-node2 16380 1727204145.97697: Calling groups_inventory to load vars for managed-node2 16380 1727204145.97702: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204145.97718: Calling all_plugins_play to load vars for managed-node2 16380 1727204145.97722: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204145.97726: Calling groups_plugins_play to load vars for managed-node2 16380 1727204145.98153: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000fe 16380 1727204145.98156: WORKER PROCESS EXITING 16380 1727204145.98183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204145.98514: done with get_vars() 16380 1727204145.98532: done getting variables 16380 1727204145.98643: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.055) 0:00:07.093 ***** 16380 1727204145.98686: entering _queue_task() for managed-node2/debug 16380 1727204145.98688: Creating lock for debug 16380 1727204145.99098: worker is 1 (out of 1 available) 16380 1727204145.99119: exiting _queue_task() for managed-node2/debug 16380 1727204145.99130: done queuing things up, now waiting for results queue to drain 16380 1727204145.99132: waiting for pending results... 
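[Annotation] The current_interfaces fact set above exists so that later test steps can check for the presence or absence of specific devices. A purely illustrative example of consuming it, not a task from this run (the interface name LSR-TST-br31 is taken from the stat task banner further below):

    # Example only: one way a test could act on the gathered fact
    - name: "Example only: fail early if the test interface already exists"
      fail:
        msg: "LSR-TST-br31 unexpectedly present"
      when: "'LSR-TST-br31' in current_interfaces"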
16380 1727204145.99393: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 16380 1727204145.99545: in run() - task 12b410aa-8751-749c-b6eb-0000000000ef 16380 1727204145.99568: variable 'ansible_search_path' from source: unknown 16380 1727204145.99608: variable 'ansible_search_path' from source: unknown 16380 1727204145.99656: calling self._execute() 16380 1727204145.99738: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204145.99749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204145.99822: variable 'omit' from source: magic vars 16380 1727204146.00793: variable 'ansible_distribution_major_version' from source: facts 16380 1727204146.00848: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204146.00873: variable 'omit' from source: magic vars 16380 1727204146.01212: variable 'omit' from source: magic vars 16380 1727204146.01215: variable 'current_interfaces' from source: set_fact 16380 1727204146.01218: variable 'omit' from source: magic vars 16380 1727204146.01456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204146.01738: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204146.01846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204146.02096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204146.02100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204146.02102: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204146.02105: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.02107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.02479: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204146.02585: Set connection var ansible_shell_executable to /bin/sh 16380 1727204146.02744: Set connection var ansible_connection to ssh 16380 1727204146.02783: Set connection var ansible_shell_type to sh 16380 1727204146.02830: Set connection var ansible_pipelining to False 16380 1727204146.02884: Set connection var ansible_timeout to 10 16380 1727204146.03088: variable 'ansible_shell_executable' from source: unknown 16380 1727204146.03093: variable 'ansible_connection' from source: unknown 16380 1727204146.03096: variable 'ansible_module_compression' from source: unknown 16380 1727204146.03241: variable 'ansible_shell_type' from source: unknown 16380 1727204146.03244: variable 'ansible_shell_executable' from source: unknown 16380 1727204146.03247: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.03249: variable 'ansible_pipelining' from source: unknown 16380 1727204146.03251: variable 'ansible_timeout' from source: unknown 16380 1727204146.03253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.03832: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
16380 1727204146.03845: variable 'omit' from source: magic vars
16380 1727204146.03872: starting attempt loop
16380 1727204146.03881: running the handler
16380 1727204146.04166: handler run complete
16380 1727204146.04170: attempt loop complete, returning result
16380 1727204146.04172: _execute() done
16380 1727204146.04175: dumping result to json
16380 1727204146.04177: done dumping result, returning
16380 1727204146.04180: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-749c-b6eb-0000000000ef]
16380 1727204146.04241: sending task result for task 12b410aa-8751-749c-b6eb-0000000000ef
16380 1727204146.04569: done sending task result for task 12b410aa-8751-749c-b6eb-0000000000ef
16380 1727204146.04573: WORKER PROCESS EXITING
ok: [managed-node2] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']

16380 1727204146.04635: no more pending results, returning what we have
16380 1727204146.04638: results queue empty
16380 1727204146.04639: checking for any_errors_fatal
16380 1727204146.04645: done checking for any_errors_fatal
16380 1727204146.04646: checking for max_fail_percentage
16380 1727204146.04648: done checking for max_fail_percentage
16380 1727204146.04649: checking to see if all hosts have failed and the running result is not ok
16380 1727204146.04650: done checking to see if all hosts have failed
16380 1727204146.04651: getting the remaining hosts for this loop
16380 1727204146.04653: done getting the remaining hosts for this loop
16380 1727204146.04663: getting the next task for host managed-node2
16380 1727204146.04677: done getting next task for host managed-node2
16380 1727204146.04684: ^ task is: TASK: Include the task 'assert_device_absent.yml'
16380 1727204146.04687: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204146.04694: getting variables
16380 1727204146.04696: in VariableManager get_vars()
16380 1727204146.04737: Calling all_inventory to load vars for managed-node2
16380 1727204146.04742: Calling groups_inventory to load vars for managed-node2
16380 1727204146.04747: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204146.04766: Calling all_plugins_play to load vars for managed-node2
16380 1727204146.04770: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204146.04774: Calling groups_plugins_play to load vars for managed-node2
16380 1727204146.05776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204146.06360: done with get_vars()
16380 1727204146.06374: done getting variables

TASK [Include the task 'assert_device_absent.yml'] *****************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14
Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.078) 0:00:07.172 *****
16380 1727204146.06551: entering _queue_task() for managed-node2/include_tasks
16380 1727204146.06912: worker is 1 (out of 1 available)
16380 1727204146.06927: exiting _queue_task() for managed-node2/include_tasks
16380 1727204146.06940: done queuing things up, now waiting for results queue to drain
16380 1727204146.06942: waiting for pending results...
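[Annotation] The include just queued comes from the top-level test play. A hedged reconstruction of the relevant fragment of tests_bridge.yml, based only on the task names and the "task path ... tests_bridge.yml:14" line above; the relative include paths are an assumption:

    # tests_bridge.yml (hedged fragment)
    - name: Include the task 'show_interfaces.yml'
      include_tasks: tasks/show_interfaces.yml

    - name: Include the task 'assert_device_absent.yml'
      include_tasks: tasks/assert_device_absent.yml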
16380 1727204146.07312: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' 16380 1727204146.07321: in run() - task 12b410aa-8751-749c-b6eb-00000000000d 16380 1727204146.07328: variable 'ansible_search_path' from source: unknown 16380 1727204146.07372: calling self._execute() 16380 1727204146.07471: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.07484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.07502: variable 'omit' from source: magic vars 16380 1727204146.07964: variable 'ansible_distribution_major_version' from source: facts 16380 1727204146.08063: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204146.08066: _execute() done 16380 1727204146.08068: dumping result to json 16380 1727204146.08071: done dumping result, returning 16380 1727204146.08073: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' [12b410aa-8751-749c-b6eb-00000000000d] 16380 1727204146.08076: sending task result for task 12b410aa-8751-749c-b6eb-00000000000d 16380 1727204146.08152: done sending task result for task 12b410aa-8751-749c-b6eb-00000000000d 16380 1727204146.08155: WORKER PROCESS EXITING 16380 1727204146.08188: no more pending results, returning what we have 16380 1727204146.08195: in VariableManager get_vars() 16380 1727204146.08236: Calling all_inventory to load vars for managed-node2 16380 1727204146.08240: Calling groups_inventory to load vars for managed-node2 16380 1727204146.08245: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.08261: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.08265: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.08269: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.08788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.09102: done with get_vars() 16380 1727204146.09113: variable 'ansible_search_path' from source: unknown 16380 1727204146.09127: we have included files to process 16380 1727204146.09128: generating all_blocks data 16380 1727204146.09130: done generating all_blocks data 16380 1727204146.09136: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 16380 1727204146.09137: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 16380 1727204146.09140: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 16380 1727204146.09337: in VariableManager get_vars() 16380 1727204146.09356: done with get_vars() 16380 1727204146.09528: done processing included file 16380 1727204146.09531: iterating over new_blocks loaded from include file 16380 1727204146.09533: in VariableManager get_vars() 16380 1727204146.09547: done with get_vars() 16380 1727204146.09549: filtering new block on tags 16380 1727204146.09570: done filtering new block on tags 16380 1727204146.09573: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 16380 1727204146.09579: extending task lists for all 
hosts with included blocks 16380 1727204146.09799: done extending task lists 16380 1727204146.09801: done processing included files 16380 1727204146.09802: results queue empty 16380 1727204146.09803: checking for any_errors_fatal 16380 1727204146.09806: done checking for any_errors_fatal 16380 1727204146.09813: checking for max_fail_percentage 16380 1727204146.09815: done checking for max_fail_percentage 16380 1727204146.09816: checking to see if all hosts have failed and the running result is not ok 16380 1727204146.09817: done checking to see if all hosts have failed 16380 1727204146.09818: getting the remaining hosts for this loop 16380 1727204146.09819: done getting the remaining hosts for this loop 16380 1727204146.09823: getting the next task for host managed-node2 16380 1727204146.09827: done getting next task for host managed-node2 16380 1727204146.09830: ^ task is: TASK: Include the task 'get_interface_stat.yml' 16380 1727204146.09833: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204146.09835: getting variables 16380 1727204146.09836: in VariableManager get_vars() 16380 1727204146.09845: Calling all_inventory to load vars for managed-node2 16380 1727204146.09848: Calling groups_inventory to load vars for managed-node2 16380 1727204146.09851: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.09858: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.09861: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.09864: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.10073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.10387: done with get_vars() 16380 1727204146.10400: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.039) 0:00:07.211 ***** 16380 1727204146.10488: entering _queue_task() for managed-node2/include_tasks 16380 1727204146.10906: worker is 1 (out of 1 available) 16380 1727204146.10921: exiting _queue_task() for managed-node2/include_tasks 16380 1727204146.10934: done queuing things up, now waiting for results queue to drain 16380 1727204146.10936: waiting for pending results... 
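[Annotation] assert_device_absent.yml itself begins by including get_interface_stat.yml (task path assert_device_absent.yml:3 above). Its remaining contents are not visible in this section; the assert sketched here is a guess based on the file's name and purpose, and the register name interface_stat is hypothetical:

    # assert_device_absent.yml (hedged reconstruction; the assert is hypothetical)
    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml

    - name: Assert that the interface is absent  # hypothetical task
      assert:
        that:
          - not interface_stat.stat.exists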
16380 1727204146.11250: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 16380 1727204146.11256: in run() - task 12b410aa-8751-749c-b6eb-000000000119 16380 1727204146.11347: variable 'ansible_search_path' from source: unknown 16380 1727204146.11351: variable 'ansible_search_path' from source: unknown 16380 1727204146.11354: calling self._execute() 16380 1727204146.11405: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.11423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.11440: variable 'omit' from source: magic vars 16380 1727204146.11892: variable 'ansible_distribution_major_version' from source: facts 16380 1727204146.11916: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204146.11928: _execute() done 16380 1727204146.11937: dumping result to json 16380 1727204146.11947: done dumping result, returning 16380 1727204146.11956: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-749c-b6eb-000000000119] 16380 1727204146.11967: sending task result for task 12b410aa-8751-749c-b6eb-000000000119 16380 1727204146.12139: no more pending results, returning what we have 16380 1727204146.12144: in VariableManager get_vars() 16380 1727204146.12180: Calling all_inventory to load vars for managed-node2 16380 1727204146.12184: Calling groups_inventory to load vars for managed-node2 16380 1727204146.12190: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.12207: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.12214: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.12218: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.12655: done sending task result for task 12b410aa-8751-749c-b6eb-000000000119 16380 1727204146.12659: WORKER PROCESS EXITING 16380 1727204146.12687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.13007: done with get_vars() 16380 1727204146.13018: variable 'ansible_search_path' from source: unknown 16380 1727204146.13019: variable 'ansible_search_path' from source: unknown 16380 1727204146.13066: we have included files to process 16380 1727204146.13067: generating all_blocks data 16380 1727204146.13069: done generating all_blocks data 16380 1727204146.13071: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16380 1727204146.13072: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16380 1727204146.13075: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16380 1727204146.13360: done processing included file 16380 1727204146.13363: iterating over new_blocks loaded from include file 16380 1727204146.13365: in VariableManager get_vars() 16380 1727204146.13386: done with get_vars() 16380 1727204146.13388: filtering new block on tags 16380 1727204146.13414: done filtering new block on tags 16380 1727204146.13417: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 16380 
1727204146.13423: extending task lists for all hosts with included blocks 16380 1727204146.13561: done extending task lists 16380 1727204146.13563: done processing included files 16380 1727204146.13564: results queue empty 16380 1727204146.13565: checking for any_errors_fatal 16380 1727204146.13568: done checking for any_errors_fatal 16380 1727204146.13569: checking for max_fail_percentage 16380 1727204146.13570: done checking for max_fail_percentage 16380 1727204146.13571: checking to see if all hosts have failed and the running result is not ok 16380 1727204146.13572: done checking to see if all hosts have failed 16380 1727204146.13573: getting the remaining hosts for this loop 16380 1727204146.13574: done getting the remaining hosts for this loop 16380 1727204146.13577: getting the next task for host managed-node2 16380 1727204146.13582: done getting next task for host managed-node2 16380 1727204146.13585: ^ task is: TASK: Get stat for interface {{ interface }} 16380 1727204146.13591: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204146.13594: getting variables 16380 1727204146.13595: in VariableManager get_vars() 16380 1727204146.13612: Calling all_inventory to load vars for managed-node2 16380 1727204146.13614: Calling groups_inventory to load vars for managed-node2 16380 1727204146.13617: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.13623: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.13626: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.13630: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.13862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.14150: done with get_vars() 16380 1727204146.14161: done getting variables 16380 1727204146.14354: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.039) 0:00:07.250 ***** 16380 1727204146.14395: entering _queue_task() for managed-node2/stat 16380 1727204146.14826: worker is 1 (out of 1 available) 16380 1727204146.14838: exiting _queue_task() for managed-node2/stat 16380 1727204146.14850: done queuing things up, now waiting for results queue to drain 16380 1727204146.14852: waiting for pending results... 
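The stat invocation that follows pins down what get_interface_stat.yml must contain: the module_args echoed in the result further below show path=/sys/class/net/LSR-TST-br31 with get_attributes, get_checksum and get_mime all false, and the later assert reads a registered variable named interface_stat. A sketch of the file consistent with those observations:

    # tasks/get_interface_stat.yml (inferred sketch; option values mirror
    # the module_args echoed in the stat result below)
    - name: Get stat for interface {{ interface }}
      ansible.builtin.stat:
        get_attributes: false
        get_checksum: false
        get_mime: false
        path: "/sys/class/net/{{ interface }}"
      register: interface_stat

Probing /sys/class/net/{{ interface }} works because the kernel exposes one sysfs directory per network device, so a bare existence test is enough to decide whether the device is present; none of the content or checksum options are needed.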
16380 1727204146.15036: running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 16380 1727204146.15171: in run() - task 12b410aa-8751-749c-b6eb-000000000133 16380 1727204146.15197: variable 'ansible_search_path' from source: unknown 16380 1727204146.15205: variable 'ansible_search_path' from source: unknown 16380 1727204146.15255: calling self._execute() 16380 1727204146.15352: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.15367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.15383: variable 'omit' from source: magic vars 16380 1727204146.15833: variable 'ansible_distribution_major_version' from source: facts 16380 1727204146.15856: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204146.15873: variable 'omit' from source: magic vars 16380 1727204146.15948: variable 'omit' from source: magic vars 16380 1727204146.16067: variable 'interface' from source: set_fact 16380 1727204146.16119: variable 'omit' from source: magic vars 16380 1727204146.16150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204146.16205: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204146.16241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204146.16336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204146.16339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204146.16342: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204146.16345: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.16347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.16665: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204146.16669: Set connection var ansible_shell_executable to /bin/sh 16380 1727204146.16672: Set connection var ansible_connection to ssh 16380 1727204146.16674: Set connection var ansible_shell_type to sh 16380 1727204146.16677: Set connection var ansible_pipelining to False 16380 1727204146.16679: Set connection var ansible_timeout to 10 16380 1727204146.16682: variable 'ansible_shell_executable' from source: unknown 16380 1727204146.16883: variable 'ansible_connection' from source: unknown 16380 1727204146.16887: variable 'ansible_module_compression' from source: unknown 16380 1727204146.16892: variable 'ansible_shell_type' from source: unknown 16380 1727204146.16895: variable 'ansible_shell_executable' from source: unknown 16380 1727204146.16898: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.16900: variable 'ansible_pipelining' from source: unknown 16380 1727204146.16902: variable 'ansible_timeout' from source: unknown 16380 1727204146.16905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.17279: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204146.17495: variable 
'omit' from source: magic vars 16380 1727204146.17499: starting attempt loop 16380 1727204146.17501: running the handler 16380 1727204146.17503: _low_level_execute_command(): starting 16380 1727204146.17505: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204146.18908: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204146.18955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204146.18971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204146.18983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204146.19082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204146.19232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.19279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.21087: stdout chunk (state=3): >>>/root <<< 16380 1727204146.21246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204146.21307: stderr chunk (state=3): >>><<< 16380 1727204146.21313: stdout chunk (state=3): >>><<< 16380 1727204146.21345: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204146.21363: _low_level_execute_command(): starting 16380 1727204146.21372: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666 `" && echo 
ansible-tmp-1727204146.2134695-16902-245206314334666="` echo /root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666 `" ) && sleep 0' 16380 1727204146.22048: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204146.22107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204146.22169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204146.22191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204146.22217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.22287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.24520: stdout chunk (state=3): >>>ansible-tmp-1727204146.2134695-16902-245206314334666=/root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666 <<< 16380 1727204146.24541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204146.24695: stderr chunk (state=3): >>><<< 16380 1727204146.24699: stdout chunk (state=3): >>><<< 16380 1727204146.24702: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204146.2134695-16902-245206314334666=/root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204146.24705: variable 'ansible_module_compression' from source: unknown 16380 1727204146.24727: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16380 
1727204146.24765: variable 'ansible_facts' from source: unknown 16380 1727204146.24867: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/AnsiballZ_stat.py 16380 1727204146.25015: Sending initial data 16380 1727204146.25019: Sent initial data (153 bytes) 16380 1727204146.25794: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204146.25798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.25858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.27599: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204146.27657: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204146.27680: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpampqf649 /root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/AnsiballZ_stat.py <<< 16380 1727204146.27717: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/AnsiballZ_stat.py" <<< 16380 1727204146.27927: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpampqf649" to remote "/root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/AnsiballZ_stat.py" <<< 16380 1727204146.29941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204146.30119: stderr chunk (state=3): >>><<< 16380 1727204146.30175: stdout chunk (state=3): >>><<< 16380 1727204146.30179: done transferring module to remote 16380 1727204146.30181: _low_level_execute_command(): starting 16380 1727204146.30195: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/ /root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/AnsiballZ_stat.py && sleep 0' 16380 1727204146.30885: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204146.30919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204146.30923: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204146.31008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.31056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.33060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204146.33085: stderr chunk (state=3): >>><<< 16380 1727204146.33091: stdout chunk (state=3): >>><<< 16380 1727204146.33113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204146.33155: _low_level_execute_command(): starting 16380 1727204146.33159: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/AnsiballZ_stat.py && sleep 0' 16380 1727204146.33804: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204146.33821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204146.33849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204146.33867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204146.33913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204146.33964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204146.33967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204146.34047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204146.34079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.34144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.51818: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16380 1727204146.53508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204146.53513: stdout chunk (state=3): >>><<< 16380 1727204146.53516: stderr chunk (state=3): >>><<< 16380 1727204146.53597: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204146.54096: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204146.54099: _low_level_execute_command(): starting 16380 1727204146.54104: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204146.2134695-16902-245206314334666/ > /dev/null 2>&1 && sleep 0' 16380 1727204146.55561: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204146.55599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204146.55703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204146.56177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.56282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.58302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204146.58316: stdout chunk (state=3): >>><<< 16380 1727204146.58337: stderr chunk (state=3): >>><<< 16380 1727204146.58457: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204146.58461: handler run complete 16380 1727204146.58482: attempt loop complete, returning result 16380 1727204146.58493: _execute() done 16380 1727204146.58502: dumping result to json 16380 1727204146.58515: done dumping result, returning 16380 1727204146.58530: done running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000133] 16380 1727204146.58544: sending task result for task 12b410aa-8751-749c-b6eb-000000000133 16380 1727204146.58811: done sending task result for task 12b410aa-8751-749c-b6eb-000000000133 16380 1727204146.58816: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 16380 1727204146.58887: no more pending results, returning what we have 16380 1727204146.58892: results queue empty 16380 1727204146.58893: checking for any_errors_fatal 16380 1727204146.58895: done checking for any_errors_fatal 16380 1727204146.58896: checking for max_fail_percentage 16380 1727204146.58898: done checking for max_fail_percentage 16380 1727204146.58898: checking to see if all hosts have failed and the running result is not ok 16380 1727204146.58899: done checking to see if all hosts have failed 16380 1727204146.58901: getting the remaining hosts for this loop 16380 1727204146.58902: done getting the remaining hosts for this loop 16380 1727204146.58907: getting the next task for host managed-node2 16380 1727204146.58915: done getting next task for host managed-node2 16380 1727204146.58917: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 16380 1727204146.58921: 
^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204146.58924: getting variables 16380 1727204146.58926: in VariableManager get_vars() 16380 1727204146.58955: Calling all_inventory to load vars for managed-node2 16380 1727204146.58958: Calling groups_inventory to load vars for managed-node2 16380 1727204146.58962: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.58974: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.58977: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.58980: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.59292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.59603: done with get_vars() 16380 1727204146.59621: done getting variables 16380 1727204146.59751: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 16380 1727204146.60113: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.457) 0:00:07.708 ***** 16380 1727204146.60149: entering _queue_task() for managed-node2/assert 16380 1727204146.60151: Creating lock for assert 16380 1727204146.60980: worker is 1 (out of 1 available) 16380 1727204146.60995: exiting _queue_task() for managed-node2/assert 16380 1727204146.61245: done queuing things up, now waiting for results queue to drain 16380 1727204146.61248: waiting for pending results... 
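Note that assert is an action plugin evaluated on the controller; between queuing here and the result there is no _low_level_execute_command round-trip to the managed node. The conditional it will evaluate, not interface_stat.stat.exists, runs against the structure registered by the stat task above, which (abridged to the fields the test actually uses) is:

    # registered result the assertion consumes, taken from the stat output above
    interface_stat:
      changed: false
      stat:
        exists: false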
16380 1727204146.61470: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' 16380 1727204146.61678: in run() - task 12b410aa-8751-749c-b6eb-00000000011a 16380 1727204146.61692: variable 'ansible_search_path' from source: unknown 16380 1727204146.61803: variable 'ansible_search_path' from source: unknown 16380 1727204146.61846: calling self._execute() 16380 1727204146.62547: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.62551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.62554: variable 'omit' from source: magic vars 16380 1727204146.62983: variable 'ansible_distribution_major_version' from source: facts 16380 1727204146.62988: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204146.62994: variable 'omit' from source: magic vars 16380 1727204146.63026: variable 'omit' from source: magic vars 16380 1727204146.63244: variable 'interface' from source: set_fact 16380 1727204146.63248: variable 'omit' from source: magic vars 16380 1727204146.63251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204146.63266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204146.63292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204146.63315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204146.63327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204146.63416: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204146.63420: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.63423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.63678: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204146.63681: Set connection var ansible_shell_executable to /bin/sh 16380 1727204146.63684: Set connection var ansible_connection to ssh 16380 1727204146.63686: Set connection var ansible_shell_type to sh 16380 1727204146.63691: Set connection var ansible_pipelining to False 16380 1727204146.63693: Set connection var ansible_timeout to 10 16380 1727204146.63696: variable 'ansible_shell_executable' from source: unknown 16380 1727204146.63699: variable 'ansible_connection' from source: unknown 16380 1727204146.63701: variable 'ansible_module_compression' from source: unknown 16380 1727204146.63703: variable 'ansible_shell_type' from source: unknown 16380 1727204146.63705: variable 'ansible_shell_executable' from source: unknown 16380 1727204146.63707: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.63712: variable 'ansible_pipelining' from source: unknown 16380 1727204146.63714: variable 'ansible_timeout' from source: unknown 16380 1727204146.63717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.63916: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 16380 1727204146.63940: variable 'omit' from source: magic vars 16380 1727204146.63944: starting attempt loop 16380 1727204146.63947: running the handler 16380 1727204146.64198: variable 'interface_stat' from source: set_fact 16380 1727204146.64202: Evaluated conditional (not interface_stat.stat.exists): True 16380 1727204146.64204: handler run complete 16380 1727204146.64207: attempt loop complete, returning result 16380 1727204146.64212: _execute() done 16380 1727204146.64214: dumping result to json 16380 1727204146.64217: done dumping result, returning 16380 1727204146.64304: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [12b410aa-8751-749c-b6eb-00000000011a] 16380 1727204146.64311: sending task result for task 12b410aa-8751-749c-b6eb-00000000011a 16380 1727204146.64375: done sending task result for task 12b410aa-8751-749c-b6eb-00000000011a 16380 1727204146.64378: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16380 1727204146.64773: no more pending results, returning what we have 16380 1727204146.64776: results queue empty 16380 1727204146.64777: checking for any_errors_fatal 16380 1727204146.64781: done checking for any_errors_fatal 16380 1727204146.64782: checking for max_fail_percentage 16380 1727204146.64784: done checking for max_fail_percentage 16380 1727204146.64785: checking to see if all hosts have failed and the running result is not ok 16380 1727204146.64786: done checking to see if all hosts have failed 16380 1727204146.64787: getting the remaining hosts for this loop 16380 1727204146.64788: done getting the remaining hosts for this loop 16380 1727204146.64796: getting the next task for host managed-node2 16380 1727204146.64804: done getting next task for host managed-node2 16380 1727204146.64806: ^ task is: TASK: meta (flush_handlers) 16380 1727204146.64808: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204146.64814: getting variables 16380 1727204146.64815: in VariableManager get_vars() 16380 1727204146.64840: Calling all_inventory to load vars for managed-node2 16380 1727204146.64846: Calling groups_inventory to load vars for managed-node2 16380 1727204146.64850: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.64873: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.64882: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.64887: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.65107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.65416: done with get_vars() 16380 1727204146.65431: done getting variables 16380 1727204146.65506: in VariableManager get_vars() 16380 1727204146.65519: Calling all_inventory to load vars for managed-node2 16380 1727204146.65521: Calling groups_inventory to load vars for managed-node2 16380 1727204146.65525: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.65534: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.65538: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.65542: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.65761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.66275: done with get_vars() 16380 1727204146.66291: done queuing things up, now waiting for results queue to drain 16380 1727204146.66293: results queue empty 16380 1727204146.66294: checking for any_errors_fatal 16380 1727204146.66297: done checking for any_errors_fatal 16380 1727204146.66298: checking for max_fail_percentage 16380 1727204146.66299: done checking for max_fail_percentage 16380 1727204146.66300: checking to see if all hosts have failed and the running result is not ok 16380 1727204146.66301: done checking to see if all hosts have failed 16380 1727204146.66307: getting the remaining hosts for this loop 16380 1727204146.66311: done getting the remaining hosts for this loop 16380 1727204146.66318: getting the next task for host managed-node2 16380 1727204146.66323: done getting next task for host managed-node2 16380 1727204146.66325: ^ task is: TASK: meta (flush_handlers) 16380 1727204146.66326: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204146.66330: getting variables 16380 1727204146.66331: in VariableManager get_vars() 16380 1727204146.66341: Calling all_inventory to load vars for managed-node2 16380 1727204146.66344: Calling groups_inventory to load vars for managed-node2 16380 1727204146.66347: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.66352: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.66355: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.66359: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.66605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.67356: done with get_vars() 16380 1727204146.67366: done getting variables 16380 1727204146.67536: in VariableManager get_vars() 16380 1727204146.67547: Calling all_inventory to load vars for managed-node2 16380 1727204146.67549: Calling groups_inventory to load vars for managed-node2 16380 1727204146.67553: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.67558: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.67561: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.67564: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.68077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.68691: done with get_vars() 16380 1727204146.68705: done queuing things up, now waiting for results queue to drain 16380 1727204146.68707: results queue empty 16380 1727204146.68709: checking for any_errors_fatal 16380 1727204146.68710: done checking for any_errors_fatal 16380 1727204146.68711: checking for max_fail_percentage 16380 1727204146.68712: done checking for max_fail_percentage 16380 1727204146.68713: checking to see if all hosts have failed and the running result is not ok 16380 1727204146.68714: done checking to see if all hosts have failed 16380 1727204146.68715: getting the remaining hosts for this loop 16380 1727204146.68716: done getting the remaining hosts for this loop 16380 1727204146.68720: getting the next task for host managed-node2 16380 1727204146.68723: done getting next task for host managed-node2 16380 1727204146.68724: ^ task is: None 16380 1727204146.68726: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204146.68728: done queuing things up, now waiting for results queue to drain 16380 1727204146.68729: results queue empty 16380 1727204146.68730: checking for any_errors_fatal 16380 1727204146.68731: done checking for any_errors_fatal 16380 1727204146.68732: checking for max_fail_percentage 16380 1727204146.68733: done checking for max_fail_percentage 16380 1727204146.68734: checking to see if all hosts have failed and the running result is not ok 16380 1727204146.68735: done checking to see if all hosts have failed 16380 1727204146.68737: getting the next task for host managed-node2 16380 1727204146.68740: done getting next task for host managed-node2 16380 1727204146.68742: ^ task is: None 16380 1727204146.68744: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204146.68871: in VariableManager get_vars() 16380 1727204146.68897: done with get_vars() 16380 1727204146.68909: in VariableManager get_vars() 16380 1727204146.68927: done with get_vars() 16380 1727204146.68933: variable 'omit' from source: magic vars 16380 1727204146.68969: in VariableManager get_vars() 16380 1727204146.68985: done with get_vars() 16380 1727204146.69024: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 16380 1727204146.70412: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204146.70545: getting the remaining hosts for this loop 16380 1727204146.70547: done getting the remaining hosts for this loop 16380 1727204146.70550: getting the next task for host managed-node2 16380 1727204146.70553: done getting next task for host managed-node2 16380 1727204146.70555: ^ task is: TASK: Gathering Facts 16380 1727204146.70557: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204146.70559: getting variables 16380 1727204146.70560: in VariableManager get_vars() 16380 1727204146.70574: Calling all_inventory to load vars for managed-node2 16380 1727204146.70576: Calling groups_inventory to load vars for managed-node2 16380 1727204146.70579: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204146.70585: Calling all_plugins_play to load vars for managed-node2 16380 1727204146.70588: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204146.70596: Calling groups_plugins_play to load vars for managed-node2 16380 1727204146.70938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204146.71217: done with get_vars() 16380 1727204146.71226: done getting variables 16380 1727204146.71274: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.111) 0:00:07.819 ***** 16380 1727204146.71307: entering _queue_task() for managed-node2/gather_facts 16380 1727204146.71863: worker is 1 (out of 1 available) 16380 1727204146.71877: exiting _queue_task() for managed-node2/gather_facts 16380 1727204146.71892: done queuing things up, now waiting for results queue to drain 16380 1727204146.71894: waiting for pending results... 
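The implicit "Gathering Facts" task for the new play is carried out by the setup module (the cached AnsiballZ payload loaded shortly below is ansible.modules.setup), so the same remote lifecycle as the stat task repeats: discover the home directory, create a temp dir, push AnsiballZ_setup.py, execute it, clean up. An explicit equivalent of this implicit task would be roughly:

    # hedged sketch: explicit form of the implicit fact-gathering step
    - name: Gathering Facts
      ansible.builtin.setup: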
16380 1727204146.72415: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204146.72420: in run() - task 12b410aa-8751-749c-b6eb-00000000014c 16380 1727204146.72424: variable 'ansible_search_path' from source: unknown 16380 1727204146.72434: calling self._execute() 16380 1727204146.72541: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.72554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.72571: variable 'omit' from source: magic vars 16380 1727204146.73186: variable 'ansible_distribution_major_version' from source: facts 16380 1727204146.73227: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204146.73335: variable 'omit' from source: magic vars 16380 1727204146.73372: variable 'omit' from source: magic vars 16380 1727204146.73553: variable 'omit' from source: magic vars 16380 1727204146.73635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204146.73758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204146.73861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204146.73893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204146.73968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204146.74014: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204146.74041: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.74050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.74200: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204146.74216: Set connection var ansible_shell_executable to /bin/sh 16380 1727204146.74232: Set connection var ansible_connection to ssh 16380 1727204146.74245: Set connection var ansible_shell_type to sh 16380 1727204146.74256: Set connection var ansible_pipelining to False 16380 1727204146.74270: Set connection var ansible_timeout to 10 16380 1727204146.74310: variable 'ansible_shell_executable' from source: unknown 16380 1727204146.74320: variable 'ansible_connection' from source: unknown 16380 1727204146.74502: variable 'ansible_module_compression' from source: unknown 16380 1727204146.74506: variable 'ansible_shell_type' from source: unknown 16380 1727204146.74509: variable 'ansible_shell_executable' from source: unknown 16380 1727204146.74511: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204146.74513: variable 'ansible_pipelining' from source: unknown 16380 1727204146.74515: variable 'ansible_timeout' from source: unknown 16380 1727204146.74517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204146.74714: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204146.74816: variable 'omit' from source: magic vars 16380 1727204146.74903: starting attempt loop 16380 1727204146.74913: running the 
handler 16380 1727204146.74994: variable 'ansible_facts' from source: unknown 16380 1727204146.75002: _low_level_execute_command(): starting 16380 1727204146.75006: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204146.76018: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204146.76151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204146.76201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.76258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.78556: stdout chunk (state=3): >>>/root <<< 16380 1727204146.78559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204146.78562: stdout chunk (state=3): >>><<< 16380 1727204146.78565: stderr chunk (state=3): >>><<< 16380 1727204146.78568: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204146.78570: _low_level_execute_command(): starting 16380 1727204146.78573: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438 `" && echo ansible-tmp-1727204146.784448-16988-257723837402438="` echo /root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438 `" ) && sleep 0' 16380 1727204146.80315: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 
24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204146.80899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204146.80903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204146.80905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.80908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.82824: stdout chunk (state=3): >>>ansible-tmp-1727204146.784448-16988-257723837402438=/root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438 <<< 16380 1727204146.83040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204146.83437: stderr chunk (state=3): >>><<< 16380 1727204146.83441: stdout chunk (state=3): >>><<< 16380 1727204146.83467: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204146.784448-16988-257723837402438=/root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204146.83513: variable 'ansible_module_compression' from source: unknown 16380 1727204146.84013: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204146.84017: variable 'ansible_facts' from source: unknown 16380 1727204146.84618: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/AnsiballZ_setup.py 16380 1727204146.84915: Sending initial data 16380 1727204146.84926: Sent initial data (153 
bytes) 16380 1727204146.86280: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204146.86313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204146.86331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204146.86376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204146.86481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204146.86604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204146.86659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.86798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.88468: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204146.88553: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204146.88624: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpqkj4ovao /root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/AnsiballZ_setup.py <<< 16380 1727204146.88628: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/AnsiballZ_setup.py" <<< 16380 1727204146.88716: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpqkj4ovao" to remote "/root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/AnsiballZ_setup.py" <<< 16380 1727204146.93783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204146.93787: stderr chunk (state=3): >>><<< 16380 1727204146.93793: stdout chunk (state=3): >>><<< 16380 1727204146.93796: done transferring module to remote 16380 1727204146.93799: _low_level_execute_command(): starting 16380 1727204146.93801: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/ /root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/AnsiballZ_setup.py && sleep 0' 16380 1727204146.95207: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204146.95312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204146.95523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.95546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204146.97734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204146.97738: stdout chunk (state=3): >>><<< 16380 1727204146.97740: stderr chunk (state=3): >>><<< 16380 1727204146.97768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204146.97878: _low_level_execute_command(): starting 16380 1727204146.97882: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/AnsiballZ_setup.py && sleep 0' 16380 1727204146.99175: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204146.99247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204146.99403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204146.99515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204146.99596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204147.68331: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", 
"ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2819, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 898, "free": 2819}, "nocache": {"free": 3446, "used": 271}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 651, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251156054016, "block_size": 4096, "block_total": 64479564, "block_available": 61317396, "block_used": 3162168, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_loadavg": {"1m": 0.55419921875, "5m": 0.548828125, "15m": 0.34423828125}, "ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_selinux_python_present": true, "ansible_selinux": 
{"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "47", "epoch": "1727204147", "epoch_int": "1727204147", "date": "2024-09-24", "time": "14:55:47", "iso8601_micro": "2024-09-24T18:55:47.680500Z", "iso8601": "2024-09-24T18:55:47Z", "iso8601_basic": "20240924T145547680500", "iso8601_basic_short": "20240924T145547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204147.70813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204147.70827: stdout chunk (state=3): >>><<< 16380 1727204147.70840: stderr chunk (state=3): >>><<< 16380 1727204147.70876: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2819, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 898, "free": 2819}, "nocache": {"free": 3446, "used": 271}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 651, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251156054016, "block_size": 4096, "block_total": 64479564, "block_available": 61317396, "block_used": 3162168, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_loadavg": {"1m": 0.55419921875, "5m": 0.548828125, "15m": 0.34423828125}, 
"ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "47", "epoch": "1727204147", "epoch_int": "1727204147", "date": "2024-09-24", "time": "14:55:47", "iso8601_micro": "2024-09-24T18:55:47.680500Z", "iso8601": "2024-09-24T18:55:47Z", "iso8601_basic": "20240924T145547680500", "iso8601_basic_short": "20240924T145547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204147.71679: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204147.71683: _low_level_execute_command(): starting 16380 1727204147.71686: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204146.784448-16988-257723837402438/ > /dev/null 2>&1 && sleep 0' 16380 1727204147.73020: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204147.73037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204147.73124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204147.73251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204147.73350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204147.73375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204147.73455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
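The exchange above is the generic remote-execution lifecycle Ansible uses for every module: _low_level_execute_command() resolves the remote home directory ('echo ~'), creates a private temp directory under umask 77, uploads the AnsiballZ_setup.py wrapper over SFTP, marks it executable, runs it with the remote /usr/bin/python3.12, and finally removes the temp directory (the rm step completes just below). A minimal sketch of the same six steps, assuming plain ssh/scp clients and a hypothetical local wrapper file; this is an illustration, not Ansible's actual implementation:

    import subprocess
    import time

    HOST = "root@10.31.9.159"        # target seen in the trace; adjust as needed
    TMP = "/root/.ansible/tmp/ansible-tmp-%f-demo" % time.time()  # hypothetical dir name
    WRAPPER = "AnsiballZ_setup.py"   # assumed local copy of the AnsiballZ payload

    def ssh(cmd):
        # One non-interactive shell per step, mirroring _low_level_execute_command().
        return subprocess.run(["ssh", HOST, cmd], capture_output=True, text=True, check=True)

    ssh("echo ~ && sleep 0")                                              # 1. resolve remote home
    ssh('( umask 77 && mkdir -p "%s" ) && sleep 0' % TMP)                 # 2. private temp dir
    subprocess.run(["scp", WRAPPER, "%s:%s/" % (HOST, TMP)], check=True)  # 3. upload (the trace uses SFTP)
    ssh("chmod u+x %s/ %s/%s && sleep 0" % (TMP, TMP, WRAPPER))           # 4. make executable
    result = ssh("/usr/bin/python3.12 %s/%s && sleep 0" % (TMP, WRAPPER)) # 5. run the module
    ssh("rm -f -r %s/ > /dev/null 2>&1 && sleep 0" % TMP)                 # 6. clean up
    print(result.stdout)  # one JSON document: {"ansible_facts": {...}, "invocation": {...}}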
16380 1727204147.75515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204147.75650: stderr chunk (state=3): >>><<<
16380 1727204147.75656: stdout chunk (state=3): >>><<<
16380 1727204147.75696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204147.75699: handler run complete
16380 1727204147.76221: variable 'ansible_facts' from source: unknown
16380 1727204147.76428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204147.77594: variable 'ansible_facts' from source: unknown
16380 1727204147.77743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204147.78088: attempt loop complete, returning result
16380 1727204147.78102: _execute() done
16380 1727204147.78111: dumping result to json
16380 1727204147.78149: done dumping result, returning
16380 1727204147.78379: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-00000000014c]
16380 1727204147.78383: sending task result for task 12b410aa-8751-749c-b6eb-00000000014c
16380 1727204147.79374: done sending task result for task 12b410aa-8751-749c-b6eb-00000000014c
16380 1727204147.79378: WORKER PROCESS EXITING
ok: [managed-node2]
16380 1727204147.79948: no more pending results, returning what we have
16380 1727204147.79951: results queue empty
16380 1727204147.79953: checking for any_errors_fatal
16380 1727204147.79954: done checking for any_errors_fatal
16380 1727204147.79955: checking for max_fail_percentage
16380 1727204147.79957: done checking for max_fail_percentage
16380 1727204147.79958: checking to see if all hosts have failed and the running result is not ok
16380 1727204147.79959: done checking to see if all hosts have failed
16380 1727204147.79960: getting the remaining hosts for this loop
16380 1727204147.79962: done getting the remaining hosts for this loop
16380 1727204147.79966: getting the next task for host managed-node2
16380 1727204147.79972: done getting next task for host managed-node2
16380 1727204147.79974: ^ task is: TASK: meta (flush_handlers)
16380 1727204147.79976: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204147.79980: getting variables
16380 1727204147.79982: in VariableManager get_vars()
16380 1727204147.80123: Calling all_inventory to load vars for managed-node2
16380 1727204147.80127: Calling groups_inventory to load vars for managed-node2
16380 1727204147.80130: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204147.80256: Calling all_plugins_play to load vars for managed-node2
16380 1727204147.80261: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204147.80266: Calling groups_plugins_play to load vars for managed-node2
16380 1727204147.80647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204147.81281: done with get_vars()
16380 1727204147.81347: done getting variables
16380 1727204147.81497: in VariableManager get_vars()
16380 1727204147.81516: Calling all_inventory to load vars for managed-node2
16380 1727204147.81519: Calling groups_inventory to load vars for managed-node2
16380 1727204147.81522: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204147.81528: Calling all_plugins_play to load vars for managed-node2
16380 1727204147.81531: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204147.81535: Calling groups_plugins_play to load vars for managed-node2
16380 1727204147.81977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204147.82684: done with get_vars()
16380 1727204147.82703: done queuing things up, now waiting for results queue to drain
16380 1727204147.82706: results queue empty
16380 1727204147.82707: checking for any_errors_fatal
16380 1727204147.82714: done checking for any_errors_fatal
16380 1727204147.82715: checking for max_fail_percentage
16380 1727204147.82716: done checking for max_fail_percentage
16380 1727204147.82717: checking to see if all hosts have failed and the running result is not ok
16380 1727204147.82722: done checking to see if all hosts have failed
16380 1727204147.82723: getting the remaining hosts for this loop
16380 1727204147.82724: done getting the remaining hosts for this loop
16380 1727204147.82728: getting the next task for host managed-node2
16380 1727204147.82732: done getting next task for host managed-node2
16380 1727204147.82735: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
16380 1727204147.82737: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204147.82796: getting variables
16380 1727204147.82798: in VariableManager get_vars()
16380 1727204147.82817: Calling all_inventory to load vars for managed-node2
16380 1727204147.82819: Calling groups_inventory to load vars for managed-node2
16380 1727204147.82822: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204147.82826: Calling all_plugins_play to load vars for managed-node2
16380 1727204147.82829: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204147.82832: Calling groups_plugins_play to load vars for managed-node2
16380 1727204147.83280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204147.84028: done with get_vars()
16380 1727204147.84038: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 14:55:47 -0400 (0:00:01.129) 0:00:08.949 *****
16380 1727204147.84257: entering _queue_task() for managed-node2/include_tasks
16380 1727204147.84957: worker is 1 (out of 1 available)
16380 1727204147.84972: exiting _queue_task() for managed-node2/include_tasks
16380 1727204147.84986: done queuing things up, now waiting for results queue to drain
16380 1727204147.84988: waiting for pending results...
16380 1727204147.85367: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
16380 1727204147.86045: in run() - task 12b410aa-8751-749c-b6eb-000000000014
16380 1727204147.86050: variable 'ansible_search_path' from source: unknown
16380 1727204147.86054: variable 'ansible_search_path' from source: unknown
16380 1727204147.86057: calling self._execute()
16380 1727204147.86696: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204147.87297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204147.87302: variable 'omit' from source: magic vars
16380 1727204147.87913: variable 'ansible_distribution_major_version' from source: facts
16380 1727204147.87933: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204147.87947: _execute() done
16380 1727204147.87958: dumping result to json
16380 1727204147.87966: done dumping result, returning
16380 1727204147.88095: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-749c-b6eb-000000000014]
16380 1727204147.88099: sending task result for task 12b410aa-8751-749c-b6eb-000000000014
16380 1727204147.88239: no more pending results, returning what we have
16380 1727204147.88245: in VariableManager get_vars()
16380 1727204147.88294: Calling all_inventory to load vars for managed-node2
16380 1727204147.88297: Calling groups_inventory to load vars for managed-node2
16380 1727204147.88300: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204147.88316: Calling all_plugins_play to load vars for managed-node2
16380 1727204147.88319: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204147.88324: Calling groups_plugins_play to load vars for managed-node2
16380 1727204147.88941: done sending task result for task 12b410aa-8751-749c-b6eb-000000000014
16380 1727204147.88945: WORKER PROCESS EXITING
16380 1727204147.88973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204147.89620: done with get_vars()
16380 1727204147.89629: variable 'ansible_search_path' from source: unknown
16380 1727204147.89631: variable 'ansible_search_path' from source: unknown
16380 1727204147.89662: we have included files to process
16380 1727204147.89664: generating all_blocks data
16380 1727204147.89665: done generating all_blocks data
16380 1727204147.89666: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
16380 1727204147.89668: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
16380 1727204147.89671: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
16380 1727204147.91608: done processing included file
16380 1727204147.91610: iterating over new_blocks loaded from include file
16380 1727204147.91612: in VariableManager get_vars()
16380 1727204147.91637: done with get_vars()
16380 1727204147.91639: filtering new block on tags
16380 1727204147.91660: done filtering new block on tags
16380 1727204147.91663: in VariableManager get_vars()
16380 1727204147.91686: done with get_vars()
16380 1727204147.91688: filtering new block on tags
16380 1727204147.91714: done filtering new block on tags
16380 1727204147.91717: in VariableManager get_vars()
16380 1727204147.91741: done with get_vars()
16380 1727204147.91743: filtering new block on tags
16380 1727204147.91763: done filtering new block on tags
16380 1727204147.91766: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2
16380 1727204147.91772: extending task lists for all hosts with included blocks
16380 1727204147.93057: done extending task lists
16380 1727204147.93058: done processing included files
16380 1727204147.93059: results queue empty
16380 1727204147.93060: checking for any_errors_fatal
16380 1727204147.93062: done checking for any_errors_fatal
16380 1727204147.93063: checking for max_fail_percentage
16380 1727204147.93064: done checking for max_fail_percentage
16380 1727204147.93065: checking to see if all hosts have failed and the running result is not ok
16380 1727204147.93066: done checking to see if all hosts have failed
16380 1727204147.93067: getting the remaining hosts for this loop
16380 1727204147.93069: done getting the remaining hosts for this loop
16380 1727204147.93072: getting the next task for host managed-node2
16380 1727204147.93077: done getting next task for host managed-node2
16380 1727204147.93080: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
16380 1727204147.93083: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
False
16380 1727204147.93153: getting variables
16380 1727204147.93155: in VariableManager get_vars()
16380 1727204147.93173: Calling all_inventory to load vars for managed-node2
16380 1727204147.93176: Calling groups_inventory to load vars for managed-node2
16380 1727204147.93178: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204147.93185: Calling all_plugins_play to load vars for managed-node2
16380 1727204147.93188: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204147.93194: Calling groups_plugins_play to load vars for managed-node2
16380 1727204147.93656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204147.94262: done with get_vars()
16380 1727204147.94273: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.101) 0:00:09.050 *****
16380 1727204147.94426: entering _queue_task() for managed-node2/setup
16380 1727204147.95087: worker is 1 (out of 1 available)
16380 1727204147.95206: exiting _queue_task() for managed-node2/setup
16380 1727204147.95220: done queuing things up, now waiting for results queue to drain
16380 1727204147.95222: waiting for pending results...
16380 1727204147.95776: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
16380 1727204147.96434: in run() - task 12b410aa-8751-749c-b6eb-00000000018d
16380 1727204147.96446: variable 'ansible_search_path' from source: unknown
16380 1727204147.96450: variable 'ansible_search_path' from source: unknown
16380 1727204147.96490: calling self._execute()
16380 1727204147.96760: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204147.96766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204147.97055: variable 'omit' from source: magic vars
16380 1727204147.97884: variable 'ansible_distribution_major_version' from source: facts
16380 1727204147.97905: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204147.98586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
16380 1727204148.04205: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
16380 1727204148.04420: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
16380 1727204148.04477: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
16380 1727204148.04602: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
16380 1727204148.04767: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
16380 1727204148.04920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204148.05020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204148.05122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204148.05302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204148.05361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204148.05529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204148.05638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204148.05706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204148.05833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204148.05862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204148.06295: variable '__network_required_facts' from source: role '' defaults
16380 1727204148.06331: variable 'ansible_facts' from source: unknown
16380 1727204148.06694: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
16380 1727204148.06698: when evaluation is False, skipping this task
16380 1727204148.06700: _execute() done
16380 1727204148.06702: dumping result to json
16380 1727204148.06705: done dumping result, returning
16380 1727204148.06719: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-749c-b6eb-00000000018d]
16380 1727204148.06730: sending task result for task 12b410aa-8751-749c-b6eb-00000000018d
skipping: [managed-node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
16380 1727204148.07160: no more pending results, returning what we have
16380 1727204148.07165: results queue empty
16380 1727204148.07166: checking for any_errors_fatal
16380 1727204148.07168: done checking for any_errors_fatal
16380 1727204148.07169: checking for max_fail_percentage
16380 1727204148.07170: done checking for max_fail_percentage
16380 1727204148.07171: checking to see if all hosts have failed and the running result is not ok
16380 1727204148.07172: done checking to see if all hosts have failed
16380 1727204148.07173: getting the remaining hosts for this loop
16380 1727204148.07175: done getting the remaining hosts for this loop
16380 1727204148.07181: getting the next task for host managed-node2
16380 1727204148.07194: done getting next task for host managed-node2
16380 1727204148.07199: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
16380 1727204148.07202: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204148.07221: getting variables
16380 1727204148.07223: in VariableManager get_vars()
16380 1727204148.07269: Calling all_inventory to load vars for managed-node2
16380 1727204148.07272: Calling groups_inventory to load vars for managed-node2
16380 1727204148.07275: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204148.07288: Calling all_plugins_play to load vars for managed-node2
16380 1727204148.07546: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204148.07552: Calling groups_plugins_play to load vars for managed-node2
16380 1727204148.08167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204148.08920: done sending task result for task 12b410aa-8751-749c-b6eb-00000000018d
16380 1727204148.08924: WORKER PROCESS EXITING
16380 1727204148.08978: done with get_vars()
16380 1727204148.08993: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 14:55:48 -0400 (0:00:00.148) 0:00:09.199 *****
16380 1727204148.09254: entering _queue_task() for managed-node2/stat
16380 1727204148.09891: worker is 1 (out of 1 available)
16380 1727204148.09907: exiting _queue_task() for managed-node2/stat
16380 1727204148.10302: done queuing things up, now waiting for results queue to drain
16380 1727204148.10305: waiting for pending results...
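The "Ensure ansible_facts used by role are present" task above was skipped because its `when` test, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, came out False: every fact the role needs is already in the gathered fact cache, so re-running setup would be wasted work. A minimal Python sketch of the same set logic, with placeholder fact names (the real list comes from the role's defaults, not from this log):

    # Sketch of the skip condition, not Ansible source; fact names are placeholders.
    required_facts = {"ansible_distribution_major_version", "ansible_os_family"}
    gathered = {"ansible_distribution_major_version", "ansible_os_family", "ansible_kernel"}

    missing = required_facts.difference(gathered)  # mirrors the `difference` filter
    run_setup = len(missing) > 0                   # the task's `when` result
    print(run_setup)  # False -> setup is skipped, matching the log above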
16380 1727204148.10499: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree
16380 1727204148.10984: in run() - task 12b410aa-8751-749c-b6eb-00000000018f
16380 1727204148.10990: variable 'ansible_search_path' from source: unknown
16380 1727204148.10993: variable 'ansible_search_path' from source: unknown
16380 1727204148.10998: calling self._execute()
16380 1727204148.11315: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204148.11330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204148.11349: variable 'omit' from source: magic vars
16380 1727204148.11847: variable 'ansible_distribution_major_version' from source: facts
16380 1727204148.11867: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204148.12087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
16380 1727204148.12465: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
16380 1727204148.12544: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
16380 1727204148.12596: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
16380 1727204148.12699: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
16380 1727204148.12751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
16380 1727204148.12787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
16380 1727204148.12837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204148.12866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
16380 1727204148.12974: variable '__network_is_ostree' from source: set_fact
16380 1727204148.12987: Evaluated conditional (not __network_is_ostree is defined): False
16380 1727204148.12997: when evaluation is False, skipping this task
16380 1727204148.13006: _execute() done
16380 1727204148.13014: dumping result to json
16380 1727204148.13028: done dumping result, returning
16380 1727204148.13040: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-749c-b6eb-00000000018f]
16380 1727204148.13053: sending task result for task 12b410aa-8751-749c-b6eb-00000000018f
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
16380 1727204148.13241: no more pending results, returning what we have
16380 1727204148.13244: results queue empty
16380 1727204148.13245: checking for any_errors_fatal
16380 1727204148.13252: done checking for any_errors_fatal
16380 1727204148.13253: checking for max_fail_percentage
16380 1727204148.13255: done checking for max_fail_percentage
16380 1727204148.13255: checking to see if all hosts have failed and the running result is not ok
16380 1727204148.13256: done checking to see if all hosts have failed
16380 1727204148.13257: getting the remaining hosts for this loop
16380 1727204148.13259: done getting the remaining hosts for this loop
16380 1727204148.13263: getting the next task for host managed-node2
16380 1727204148.13358: done getting next task for host managed-node2
16380 1727204148.13363: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
16380 1727204148.13366: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204148.13502: getting variables
16380 1727204148.13504: in VariableManager get_vars()
16380 1727204148.13552: Calling all_inventory to load vars for managed-node2
16380 1727204148.13736: Calling groups_inventory to load vars for managed-node2
16380 1727204148.13741: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204148.13752: Calling all_plugins_play to load vars for managed-node2
16380 1727204148.13756: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204148.13760: Calling groups_plugins_play to load vars for managed-node2
16380 1727204148.14261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204148.15243: done sending task result for task 12b410aa-8751-749c-b6eb-00000000018f
16380 1727204148.15247: WORKER PROCESS EXITING
16380 1727204148.15317: done with get_vars()
16380 1727204148.15331: done getting variables
16380 1727204148.15633: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 14:55:48 -0400 (0:00:00.064) 0:00:09.264 *****
16380 1727204148.15812: entering _queue_task() for managed-node2/set_fact
16380 1727204148.16485: worker is 1 (out of 1 available)
16380 1727204148.16500: exiting _queue_task() for managed-node2/set_fact
16380 1727204148.16517: done queuing things up, now waiting for results queue to drain
16380 1727204148.16520: waiting for pending results...
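Both ostree tasks ("Check if system is ostree" above and "Set flag to indicate system is ostree" just queued) share the guard `not __network_is_ostree is defined`: the flag is computed once and then reused, so with `__network_is_ostree` already present from an earlier `set_fact` both tasks skip, as the log shows. A hedged Python sketch of that compute-once pattern; the `/run/ostree-booted` probe path is an assumption about the role's stat target, not taken from this log:

    import os

    # Compute-once detection sketch, not the role's actual code.
    facts = {"__network_is_ostree": False}  # assumed to exist from an earlier set_fact

    if "__network_is_ostree" not in facts:  # mirrors `not __network_is_ostree is defined`
        # Assumed probe: ostree-based systems expose /run/ostree-booted.
        facts["__network_is_ostree"] = os.path.exists("/run/ostree-booted")

    print(facts["__network_is_ostree"])  # reused without re-checking, as in the log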
16380 1727204148.17110: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
16380 1727204148.17584: in run() - task 12b410aa-8751-749c-b6eb-000000000190
16380 1727204148.17612: variable 'ansible_search_path' from source: unknown
16380 1727204148.17620: variable 'ansible_search_path' from source: unknown
16380 1727204148.17710: calling self._execute()
16380 1727204148.17899: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204148.18095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204148.18100: variable 'omit' from source: magic vars
16380 1727204148.18522: variable 'ansible_distribution_major_version' from source: facts
16380 1727204148.18547: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204148.18821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
16380 1727204148.19307: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
16380 1727204148.19373: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
16380 1727204148.19427: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
16380 1727204148.19477: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
16380 1727204148.19586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
16380 1727204148.19627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
16380 1727204148.19668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204148.19705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
16380 1727204148.19811: variable '__network_is_ostree' from source: set_fact
16380 1727204148.19824: Evaluated conditional (not __network_is_ostree is defined): False
16380 1727204148.19831: when evaluation is False, skipping this task
16380 1727204148.19844: _execute() done
16380 1727204148.19851: dumping result to json
16380 1727204148.19860: done dumping result, returning
16380 1727204148.19878: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-749c-b6eb-000000000190]
16380 1727204148.19892: sending task result for task 12b410aa-8751-749c-b6eb-000000000190
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
16380 1727204148.20145: no more pending results, returning what we have
16380 1727204148.20149: results queue empty
16380 1727204148.20150: checking for any_errors_fatal
16380 1727204148.20157: done checking for any_errors_fatal
16380 1727204148.20158: checking for max_fail_percentage
16380 1727204148.20160: done checking for max_fail_percentage
16380 1727204148.20161: checking to see if all hosts have failed and the running result is not ok
16380 1727204148.20162: done checking to see if all hosts have failed
16380 1727204148.20163: getting the remaining hosts for this loop
16380 1727204148.20165: done getting the remaining hosts for this loop
16380 1727204148.20170: getting the next task for host managed-node2
16380 1727204148.20180: done getting next task for host managed-node2
16380 1727204148.20184: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
16380 1727204148.20188: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204148.20406: done sending task result for task 12b410aa-8751-749c-b6eb-000000000190
16380 1727204148.20414: WORKER PROCESS EXITING
16380 1727204148.20425: getting variables
16380 1727204148.20428: in VariableManager get_vars()
16380 1727204148.20474: Calling all_inventory to load vars for managed-node2
16380 1727204148.20479: Calling groups_inventory to load vars for managed-node2
16380 1727204148.20482: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204148.20602: Calling all_plugins_play to load vars for managed-node2
16380 1727204148.20608: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204148.20613: Calling groups_plugins_play to load vars for managed-node2
16380 1727204148.21007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204148.21324: done with get_vars()
16380 1727204148.21337: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Tuesday 24 September 2024 14:55:48 -0400 (0:00:00.056) 0:00:09.321 *****
16380 1727204148.21467: entering _queue_task() for managed-node2/service_facts
16380 1727204148.21469: Creating lock for service_facts
16380 1727204148.22018: worker is 1 (out of 1 available)
16380 1727204148.22033: exiting _queue_task() for managed-node2/service_facts
16380 1727204148.22046: done queuing things up, now waiting for results queue to drain
16380 1727204148.22048: waiting for pending results...
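The service_facts task just queued is the first one in this excerpt that actually touches the managed host. Everything below is the mechanics of running it: Ansible reuses a multiplexed SSH master connection, creates a remote temp directory, uploads the AnsiballZ-packed module over SFTP, executes it with the remote Python, and reads back its stdout, which is the large JSON document at the end of this log. Once that payload is back, consuming it is a simple dictionary pass; a Python sketch of such post-processing, with two entries abbreviated from the real output below (the provider-selection use is inferred from the role's purpose, not shown in this excerpt):

    import json

    # Sketch: reading a service_facts payload shaped like the module output
    # below ({"ansible_facts": {"services": {...}}}); entries are abbreviated.
    payload = json.loads("""{
      "ansible_facts": {"services": {
        "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"},
        "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}
      }}
    }""")

    services = payload["ansible_facts"]["services"]
    running = sorted(n for n, s in services.items() if s["state"] == "running")
    print(running)  # ['NetworkManager.service'], e.g. input for picking a network provider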
16380 1727204148.22607: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running
16380 1727204148.22861: in run() - task 12b410aa-8751-749c-b6eb-000000000192
16380 1727204148.23007: variable 'ansible_search_path' from source: unknown
16380 1727204148.23012: variable 'ansible_search_path' from source: unknown
16380 1727204148.23086: calling self._execute()
16380 1727204148.23241: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204148.23254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204148.23271: variable 'omit' from source: magic vars
16380 1727204148.23765: variable 'ansible_distribution_major_version' from source: facts
16380 1727204148.23784: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204148.23800: variable 'omit' from source: magic vars
16380 1727204148.23945: variable 'omit' from source: magic vars
16380 1727204148.23950: variable 'omit' from source: magic vars
16380 1727204148.23993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
16380 1727204148.24041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
16380 1727204148.24162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
16380 1727204148.24165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204148.24169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204148.24172: variable 'inventory_hostname' from source: host vars for 'managed-node2'
16380 1727204148.24174: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204148.24176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204148.24301: Set connection var ansible_module_compression to ZIP_DEFLATED
16380 1727204148.24316: Set connection var ansible_shell_executable to /bin/sh
16380 1727204148.24328: Set connection var ansible_connection to ssh
16380 1727204148.24338: Set connection var ansible_shell_type to sh
16380 1727204148.24379: Set connection var ansible_pipelining to False
16380 1727204148.24382: Set connection var ansible_timeout to 10
16380 1727204148.24402: variable 'ansible_shell_executable' from source: unknown
16380 1727204148.24411: variable 'ansible_connection' from source: unknown
16380 1727204148.24420: variable 'ansible_module_compression' from source: unknown
16380 1727204148.24427: variable 'ansible_shell_type' from source: unknown
16380 1727204148.24434: variable 'ansible_shell_executable' from source: unknown
16380 1727204148.24489: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204148.24494: variable 'ansible_pipelining' from source: unknown
16380 1727204148.24500: variable 'ansible_timeout' from source: unknown
16380 1727204148.24502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204148.24721: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
16380 1727204148.24742: variable 'omit' from source: magic vars
16380 1727204148.24753: starting attempt loop
16380 1727204148.24760: running the handler
16380 1727204148.24779: _low_level_execute_command(): starting
16380 1727204148.24793: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
16380 1727204148.25692: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
16380 1727204148.25711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204148.25755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204148.25792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204148.27575: stdout chunk (state=3): >>>/root <<<
16380 1727204148.27718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204148.27744: stderr chunk (state=3): >>><<<
16380 1727204148.27748: stdout chunk (state=3): >>><<<
16380 1727204148.27767: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204148.27779: _low_level_execute_command(): starting
16380 1727204148.27786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184 `" && echo ansible-tmp-1727204148.2776713-17196-137920938077184="` echo /root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184 `" ) && sleep 0'
16380 1727204148.28246: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
16380 1727204148.28251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
16380 1727204148.28254: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
16380 1727204148.28266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204148.28311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
16380 1727204148.28315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204148.28360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204148.30391: stdout chunk (state=3): >>>ansible-tmp-1727204148.2776713-17196-137920938077184=/root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184 <<<
16380 1727204148.30603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204148.30606: stdout chunk (state=3): >>><<<
16380 1727204148.30611: stderr chunk (state=3): >>><<<
16380 1727204148.30614: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204148.2776713-17196-137920938077184=/root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204148.30657: variable 'ansible_module_compression' from source: unknown
16380 1727204148.30712: ANSIBALLZ: Using lock for service_facts
16380 1727204148.30716: ANSIBALLZ: Acquiring lock
16380 1727204148.30722: ANSIBALLZ: Lock acquired: 140602936643216
16380 1727204148.30730: ANSIBALLZ: Creating module
16380 1727204148.44977: ANSIBALLZ: Writing module into payload
16380 1727204148.45108: ANSIBALLZ: Writing module
16380 1727204148.45126: ANSIBALLZ: Renaming module
16380 1727204148.45129: ANSIBALLZ: Done creating module
16380 1727204148.45151: variable 'ansible_facts' from source: unknown
16380 1727204148.45199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/AnsiballZ_service_facts.py
16380 1727204148.45340: Sending initial data
16380 1727204148.45343: Sent initial data (162 bytes)
16380 1727204148.45940: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
16380 1727204148.45945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
16380 1727204148.45948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204148.45985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
16380 1727204148.45994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204148.45996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204148.46056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204148.47801: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<<
16380 1727204148.47809: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
16380 1727204148.47838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
16380 1727204148.47876: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp9wnjs3h5 /root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/AnsiballZ_service_facts.py <<<
16380 1727204148.47886: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/AnsiballZ_service_facts.py" <<<
16380 1727204148.47911: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp9wnjs3h5" to remote "/root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/AnsiballZ_service_facts.py" <<<
16380 1727204148.48703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204148.48764: stderr chunk (state=3): >>><<<
16380 1727204148.48768: stdout chunk (state=3): >>><<<
16380 1727204148.48790: done transferring module to remote
16380 1727204148.48803: _low_level_execute_command(): starting
16380 1727204148.48808: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/ /root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/AnsiballZ_service_facts.py && sleep 0'
16380 1727204148.49513: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204148.49715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204148.49740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204148.51756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204148.51775: stderr chunk (state=3): >>><<<
16380 1727204148.51784: stdout chunk (state=3): >>><<<
16380 1727204148.51810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1:
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204148.51820: _low_level_execute_command(): starting 16380 1727204148.51831: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/AnsiballZ_service_facts.py && sleep 0' 16380 1727204148.52479: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204148.52497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204148.52520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204148.52543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204148.52567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204148.52585: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204148.52698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204148.52713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204148.52872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204150.55129: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": 
"systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": 
"modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 16380 1727204150.56711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 16380 1727204150.56788: stderr chunk (state=3): >>><<< 16380 1727204150.56802: stdout chunk (state=3): >>><<< 16380 1727204150.57196: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": 
{"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": 
{"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", 
"source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": 
{"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": 
{"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204150.58561: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204150.58580: _low_level_execute_command(): starting 16380 1727204150.58598: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204148.2776713-17196-137920938077184/ > /dev/null 2>&1 && sleep 0' 16380 1727204150.59394: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204150.59417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204150.59660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204150.59828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204150.61857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204150.61869: stdout chunk (state=3): >>><<< 16380 1727204150.61886: stderr chunk (state=3): >>><<< 16380 1727204150.62195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 
24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204150.62200: handler run complete 16380 1727204150.62514: variable 'ansible_facts' from source: unknown 16380 1727204150.63066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204150.64840: variable 'ansible_facts' from source: unknown 16380 1727204150.65276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204150.66041: attempt loop complete, returning result 16380 1727204150.66103: _execute() done 16380 1727204150.66116: dumping result to json 16380 1727204150.66307: done dumping result, returning 16380 1727204150.66373: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-749c-b6eb-000000000192] 16380 1727204150.66384: sending task result for task 12b410aa-8751-749c-b6eb-000000000192 16380 1727204150.69452: done sending task result for task 12b410aa-8751-749c-b6eb-000000000192 16380 1727204150.69456: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204150.69558: no more pending results, returning what we have 16380 1727204150.69561: results queue empty 16380 1727204150.69562: checking for any_errors_fatal 16380 1727204150.69565: done checking for any_errors_fatal 16380 1727204150.69566: checking for max_fail_percentage 16380 1727204150.69567: done checking for max_fail_percentage 16380 1727204150.69568: checking to see if all hosts have failed and the running result is not ok 16380 1727204150.69569: done checking to see if all hosts have failed 16380 1727204150.69570: getting the remaining hosts for this loop 16380 1727204150.69571: done getting the remaining hosts for this loop 16380 1727204150.69575: getting the next task for host managed-node2 16380 1727204150.69581: done getting next task for host managed-node2 16380 1727204150.69586: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 16380 1727204150.69589: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204150.69605: getting variables 16380 1727204150.69607: in VariableManager get_vars() 16380 1727204150.69643: Calling all_inventory to load vars for managed-node2 16380 1727204150.69647: Calling groups_inventory to load vars for managed-node2 16380 1727204150.69650: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204150.69660: Calling all_plugins_play to load vars for managed-node2 16380 1727204150.69664: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204150.69668: Calling groups_plugins_play to load vars for managed-node2 16380 1727204150.70769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204150.71984: done with get_vars() 16380 1727204150.72006: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:50 -0400 (0:00:02.506) 0:00:11.827 ***** 16380 1727204150.72126: entering _queue_task() for managed-node2/package_facts 16380 1727204150.72128: Creating lock for package_facts 16380 1727204150.72621: worker is 1 (out of 1 available) 16380 1727204150.72632: exiting _queue_task() for managed-node2/package_facts 16380 1727204150.72645: done queuing things up, now waiting for results queue to drain 16380 1727204150.72647: waiting for pending results... 
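
The two role tasks traced here, "Check which services are running" and "Check which packages are installed" (both from roles/network/tasks/set_facts.yml, per the task paths above), are thin wrappers around Ansible's built-in service_facts and package_facts modules; their results populate ansible_facts.services and ansible_facts.packages, which is why the log records "variable 'ansible_facts' from source: unknown" once each handler completes. A minimal sketch of the same pattern outside the role, assuming a play targeting the managed-node2 host from this inventory (the final debug task is illustrative and not part of the logged run):

    - hosts: managed-node2
      gather_facts: false
      tasks:
        - name: Check which services are running
          ansible.builtin.service_facts:
          # mirrors '_ansible_no_log': True in this log; the task result
          # prints as "censored" even though -vvv still shows raw stdout
          no_log: true

        - name: Check which packages are installed
          ansible.builtin.package_facts:

        # Illustrative only: ansible_facts.services is an ordinary dict,
        # keyed by unit name with name/state/status/source fields as dumped above.
        - name: Report NetworkManager state
          ansible.builtin.debug:
            msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"

Note that no_log censors only the reported task result; as the stdout chunks earlier in this log show, the module's raw output still traverses the -vvv debug stream, so no_log should not be relied on to keep data out of verbose controller logs.
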
16380 1727204150.72886: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 16380 1727204150.72952: in run() - task 12b410aa-8751-749c-b6eb-000000000193 16380 1727204150.72981: variable 'ansible_search_path' from source: unknown 16380 1727204150.72999: variable 'ansible_search_path' from source: unknown 16380 1727204150.73050: calling self._execute() 16380 1727204150.73198: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204150.73201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204150.73207: variable 'omit' from source: magic vars 16380 1727204150.74288: variable 'ansible_distribution_major_version' from source: facts 16380 1727204150.74298: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204150.74301: variable 'omit' from source: magic vars 16380 1727204150.74339: variable 'omit' from source: magic vars 16380 1727204150.74387: variable 'omit' from source: magic vars 16380 1727204150.74551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204150.74601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204150.74662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204150.74795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204150.74799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204150.74837: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204150.74904: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204150.74917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204150.75166: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204150.75313: Set connection var ansible_shell_executable to /bin/sh 16380 1727204150.75330: Set connection var ansible_connection to ssh 16380 1727204150.75345: Set connection var ansible_shell_type to sh 16380 1727204150.75358: Set connection var ansible_pipelining to False 16380 1727204150.75374: Set connection var ansible_timeout to 10 16380 1727204150.75465: variable 'ansible_shell_executable' from source: unknown 16380 1727204150.75504: variable 'ansible_connection' from source: unknown 16380 1727204150.75524: variable 'ansible_module_compression' from source: unknown 16380 1727204150.75566: variable 'ansible_shell_type' from source: unknown 16380 1727204150.75577: variable 'ansible_shell_executable' from source: unknown 16380 1727204150.75595: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204150.75613: variable 'ansible_pipelining' from source: unknown 16380 1727204150.75698: variable 'ansible_timeout' from source: unknown 16380 1727204150.75702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204150.75925: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204150.75955: variable 'omit' from source: magic vars 16380 
1727204150.75967: starting attempt loop 16380 1727204150.75975: running the handler 16380 1727204150.76001: _low_level_execute_command(): starting 16380 1727204150.76018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204150.76871: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204150.76927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204150.76964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204150.77006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204150.77047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204150.79145: stdout chunk (state=3): >>>/root <<< 16380 1727204150.79152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204150.79374: stderr chunk (state=3): >>><<< 16380 1727204150.79380: stdout chunk (state=3): >>><<< 16380 1727204150.79383: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204150.79386: _low_level_execute_command(): starting 16380 1727204150.79513: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289 `" && echo ansible-tmp-1727204150.7932982-17397-104447143757289="` echo /root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289 `" ) && sleep 0' 16380 
1727204150.80758: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204150.80864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204150.81007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204150.81200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204150.81221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204150.81307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204150.83397: stdout chunk (state=3): >>>ansible-tmp-1727204150.7932982-17397-104447143757289=/root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289 <<< 16380 1727204150.83609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204150.83698: stderr chunk (state=3): >>><<< 16380 1727204150.83720: stdout chunk (state=3): >>><<< 16380 1727204150.83947: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204150.7932982-17397-104447143757289=/root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204150.83951: variable 'ansible_module_compression' from source: unknown 16380 1727204150.84177: ANSIBALLZ: Using lock for package_facts 16380 1727204150.84351: ANSIBALLZ: Acquiring lock 16380 1727204150.84355: ANSIBALLZ: Lock acquired: 140602933512336 16380 1727204150.84357: ANSIBALLZ: Creating module 16380 1727204151.52146: ANSIBALLZ: Writing module into payload 16380 1727204151.52470: ANSIBALLZ: 
Writing module 16380 1727204151.52473: ANSIBALLZ: Renaming module 16380 1727204151.52476: ANSIBALLZ: Done creating module 16380 1727204151.52479: variable 'ansible_facts' from source: unknown 16380 1727204151.52855: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/AnsiballZ_package_facts.py 16380 1727204151.53702: Sending initial data 16380 1727204151.53710: Sent initial data (162 bytes) 16380 1727204151.54460: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204151.54527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204151.54653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204151.54932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204151.55025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204151.56858: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204151.56945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204151.57294: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp2j3uk74z /root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/AnsiballZ_package_facts.py <<< 16380 1727204151.57298: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/AnsiballZ_package_facts.py" <<< 16380 1727204151.57416: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 16380 1727204151.57434: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp2j3uk74z" to remote "/root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/AnsiballZ_package_facts.py" <<< 16380 1727204151.63999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204151.64319: stderr chunk (state=3): >>><<< 16380 1727204151.64323: stdout chunk (state=3): >>><<< 16380 1727204151.64357: done transferring module to remote 16380 1727204151.64367: _low_level_execute_command(): starting 16380 1727204151.64373: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/ /root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/AnsiballZ_package_facts.py && sleep 0' 16380 1727204151.66384: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204151.66753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204151.66932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204151.67162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204151.69045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204151.69225: stderr chunk (state=3): >>><<< 16380 1727204151.69231: stdout chunk (state=3): >>><<< 16380 1727204151.69256: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204151.69263: _low_level_execute_command(): starting 16380 1727204151.69266: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/AnsiballZ_package_facts.py && sleep 0' 16380 1727204151.71090: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204151.71097: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204151.71100: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16380 1727204151.71102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204151.71105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204151.71524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204151.71528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204151.71733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204152.35855: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 16380 1727204152.35907: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": 
"file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 16380 1727204152.35934: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", 
"version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 16380 1727204152.35953: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": 
[{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 16380 1727204152.35981: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 
16380 1727204152.36033: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 16380 1727204152.36039: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": 
[{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 16380 1727204152.36176: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 16380 1727204152.36192: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": 
"3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 16380 1727204152.38354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204152.38360: stdout chunk (state=3): >>><<< 16380 1727204152.38362: stderr chunk (state=3): >>><<< 16380 1727204152.38504: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204152.47329: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204152.47335: _low_level_execute_command(): starting 16380 1727204152.47376: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204150.7932982-17397-104447143757289/ > /dev/null 2>&1 && sleep 0' 16380 1727204152.48469: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204152.48846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204152.48850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204152.48853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204152.48855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204152.51180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204152.51184: stdout chunk (state=3): >>><<< 16380 1727204152.51187: stderr chunk (state=3): >>><<< 16380 1727204152.51193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204152.51196: handler run complete 16380 1727204152.54063: variable 'ansible_facts' from source: unknown 16380 1727204152.56615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204152.64483: variable 'ansible_facts' from source: unknown 16380 1727204152.66768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204152.70185: attempt loop complete, returning result 16380 1727204152.70226: _execute() done 16380 1727204152.70230: dumping result to json 16380 1727204152.71208: done dumping result, returning 16380 1727204152.71217: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-749c-b6eb-000000000193] 16380 1727204152.71336: sending task result for task 12b410aa-8751-749c-b6eb-000000000193 16380 1727204152.87560: done sending task result for task 12b410aa-8751-749c-b6eb-000000000193 16380 1727204152.87569: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204152.87665: no more pending results, returning what we have 16380 1727204152.87668: results queue empty 16380 1727204152.87669: checking for any_errors_fatal 16380 1727204152.87679: done checking for any_errors_fatal 16380 1727204152.87680: checking for max_fail_percentage 16380 1727204152.87683: done checking for max_fail_percentage 16380 1727204152.87684: checking to see if all hosts have failed and the running result is not ok 16380 1727204152.87685: done checking to see if all hosts have failed 16380 1727204152.87686: getting the remaining hosts for this loop 16380 1727204152.87687: done getting the remaining hosts for this loop 16380 1727204152.87693: getting the next task for host managed-node2 16380 1727204152.87702: done getting next task for host managed-node2 16380 1727204152.87706: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 16380 1727204152.87709: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204152.87720: getting variables 16380 1727204152.87722: in VariableManager get_vars() 16380 1727204152.87756: Calling all_inventory to load vars for managed-node2 16380 1727204152.87759: Calling groups_inventory to load vars for managed-node2 16380 1727204152.87762: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204152.87772: Calling all_plugins_play to load vars for managed-node2 16380 1727204152.87775: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204152.87789: Calling groups_plugins_play to load vars for managed-node2 16380 1727204152.93888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204153.00995: done with get_vars() 16380 1727204153.01041: done getting variables 16380 1727204153.01122: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:53 -0400 (0:00:02.290) 0:00:14.118 ***** 16380 1727204153.01149: entering _queue_task() for managed-node2/debug 16380 1727204153.01424: worker is 1 (out of 1 available) 16380 1727204153.01439: exiting _queue_task() for managed-node2/debug 16380 1727204153.01453: done queuing things up, now waiting for results queue to drain 16380 1727204153.01456: waiting for pending results... 16380 1727204153.01752: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 16380 1727204153.01821: in run() - task 12b410aa-8751-749c-b6eb-000000000015 16380 1727204153.01826: variable 'ansible_search_path' from source: unknown 16380 1727204153.01964: variable 'ansible_search_path' from source: unknown 16380 1727204153.01969: calling self._execute() 16380 1727204153.02230: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204153.02235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204153.02238: variable 'omit' from source: magic vars 16380 1727204153.03039: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.03052: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204153.03067: variable 'omit' from source: magic vars 16380 1727204153.03160: variable 'omit' from source: magic vars 16380 1727204153.03401: variable 'network_provider' from source: set_fact 16380 1727204153.03423: variable 'omit' from source: magic vars 16380 1727204153.03469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204153.03536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204153.03574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204153.03597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204153.03614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 
1727204153.03649: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204153.03653: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204153.03658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204153.03792: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204153.03798: Set connection var ansible_shell_executable to /bin/sh 16380 1727204153.03801: Set connection var ansible_connection to ssh 16380 1727204153.03838: Set connection var ansible_shell_type to sh 16380 1727204153.03842: Set connection var ansible_pipelining to False 16380 1727204153.03844: Set connection var ansible_timeout to 10 16380 1727204153.03854: variable 'ansible_shell_executable' from source: unknown 16380 1727204153.03858: variable 'ansible_connection' from source: unknown 16380 1727204153.03862: variable 'ansible_module_compression' from source: unknown 16380 1727204153.03865: variable 'ansible_shell_type' from source: unknown 16380 1727204153.03900: variable 'ansible_shell_executable' from source: unknown 16380 1727204153.03903: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204153.03905: variable 'ansible_pipelining' from source: unknown 16380 1727204153.03911: variable 'ansible_timeout' from source: unknown 16380 1727204153.03914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204153.04122: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204153.04126: variable 'omit' from source: magic vars 16380 1727204153.04129: starting attempt loop 16380 1727204153.04132: running the handler 16380 1727204153.04153: handler run complete 16380 1727204153.04178: attempt loop complete, returning result 16380 1727204153.04181: _execute() done 16380 1727204153.04184: dumping result to json 16380 1727204153.04194: done dumping result, returning 16380 1727204153.04230: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-749c-b6eb-000000000015] 16380 1727204153.04234: sending task result for task 12b410aa-8751-749c-b6eb-000000000015 ok: [managed-node2] => {} MSG: Using network provider: nm 16380 1727204153.04569: no more pending results, returning what we have 16380 1727204153.04572: results queue empty 16380 1727204153.04574: checking for any_errors_fatal 16380 1727204153.04581: done checking for any_errors_fatal 16380 1727204153.04583: checking for max_fail_percentage 16380 1727204153.04584: done checking for max_fail_percentage 16380 1727204153.04585: checking to see if all hosts have failed and the running result is not ok 16380 1727204153.04586: done checking to see if all hosts have failed 16380 1727204153.04587: getting the remaining hosts for this loop 16380 1727204153.04591: done getting the remaining hosts for this loop 16380 1727204153.04595: getting the next task for host managed-node2 16380 1727204153.04601: done getting next task for host managed-node2 16380 1727204153.04607: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16380 1727204153.04611: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204153.04623: getting variables 16380 1727204153.04625: in VariableManager get_vars() 16380 1727204153.04659: Calling all_inventory to load vars for managed-node2 16380 1727204153.04666: Calling groups_inventory to load vars for managed-node2 16380 1727204153.04671: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204153.04682: Calling all_plugins_play to load vars for managed-node2 16380 1727204153.04688: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204153.04695: Calling groups_plugins_play to load vars for managed-node2 16380 1727204153.04708: done sending task result for task 12b410aa-8751-749c-b6eb-000000000015 16380 1727204153.04711: WORKER PROCESS EXITING 16380 1727204153.07286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204153.10870: done with get_vars() 16380 1727204153.10995: done getting variables 16380 1727204153.11176: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.100) 0:00:14.218 ***** 16380 1727204153.11225: entering _queue_task() for managed-node2/fail 16380 1727204153.11227: Creating lock for fail 16380 1727204153.11638: worker is 1 (out of 1 available) 16380 1727204153.11658: exiting _queue_task() for managed-node2/fail 16380 1727204153.11672: done queuing things up, now waiting for results queue to drain 16380 1727204153.11674: waiting for pending results... 
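
The "Print network provider" result above (ok: [managed-node2] => {} with MSG "Using network provider: nm") comes from a plain debug action. Below is a minimal sketch of a task that would produce this trace, reconstructed from the logged task name, task path (roles/network/tasks/main.yml:7), and message; it is an assumption about the source file, not the role's verbatim content:

    # tasks/main.yml (sketch; sits at line 7 of the role per the task path above)
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"

Only network_provider is confirmed by the trace ("variable 'network_provider' from source: set_fact"); the surrounding file layout is inferred.
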
16380 1727204153.11987: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16380 1727204153.12197: in run() - task 12b410aa-8751-749c-b6eb-000000000016 16380 1727204153.12201: variable 'ansible_search_path' from source: unknown 16380 1727204153.12204: variable 'ansible_search_path' from source: unknown 16380 1727204153.12207: calling self._execute() 16380 1727204153.12366: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204153.12370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204153.12373: variable 'omit' from source: magic vars 16380 1727204153.12955: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.12972: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204153.13177: variable 'network_state' from source: role '' defaults 16380 1727204153.13276: Evaluated conditional (network_state != {}): False 16380 1727204153.13282: when evaluation is False, skipping this task 16380 1727204153.13285: _execute() done 16380 1727204153.13287: dumping result to json 16380 1727204153.13289: done dumping result, returning 16380 1727204153.13293: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-749c-b6eb-000000000016] 16380 1727204153.13297: sending task result for task 12b410aa-8751-749c-b6eb-000000000016 16380 1727204153.13381: done sending task result for task 12b410aa-8751-749c-b6eb-000000000016 16380 1727204153.13385: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204153.13454: no more pending results, returning what we have 16380 1727204153.13458: results queue empty 16380 1727204153.13460: checking for any_errors_fatal 16380 1727204153.13468: done checking for any_errors_fatal 16380 1727204153.13469: checking for max_fail_percentage 16380 1727204153.13470: done checking for max_fail_percentage 16380 1727204153.13471: checking to see if all hosts have failed and the running result is not ok 16380 1727204153.13473: done checking to see if all hosts have failed 16380 1727204153.13474: getting the remaining hosts for this loop 16380 1727204153.13476: done getting the remaining hosts for this loop 16380 1727204153.13482: getting the next task for host managed-node2 16380 1727204153.13492: done getting next task for host managed-node2 16380 1727204153.13497: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16380 1727204153.13502: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204153.13523: getting variables 16380 1727204153.13525: in VariableManager get_vars() 16380 1727204153.13573: Calling all_inventory to load vars for managed-node2 16380 1727204153.13577: Calling groups_inventory to load vars for managed-node2 16380 1727204153.13581: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204153.14253: Calling all_plugins_play to load vars for managed-node2 16380 1727204153.14265: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204153.14292: Calling groups_plugins_play to load vars for managed-node2 16380 1727204153.17529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204153.20787: done with get_vars() 16380 1727204153.20823: done getting variables 16380 1727204153.20879: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.096) 0:00:14.315 ***** 16380 1727204153.20936: entering _queue_task() for managed-node2/fail 16380 1727204153.21305: worker is 1 (out of 1 available) 16380 1727204153.21326: exiting _queue_task() for managed-node2/fail 16380 1727204153.21341: done queuing things up, now waiting for results queue to drain 16380 1727204153.21344: waiting for pending results... 
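
The skip recorded above, and the identical one in the trace that follows, show how a multi-condition `when:` is evaluated in order and short-circuits: the first condition that comes back False is reported as "false_condition" in the task result. A sketch of the guard pattern, where `network_state != {}` is copied verbatim from the log while the fail message and the provider condition are assumptions implied by the task name:

    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        # Message wording is an assumption
        msg: Applying the network state configuration is not supported with the initscripts provider
      when:
        - network_state != {}                  # logged as the false_condition
        - network_provider == "initscripts"    # assumption, implied by the task name
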
16380 1727204153.21724: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16380 1727204153.21761: in run() - task 12b410aa-8751-749c-b6eb-000000000017 16380 1727204153.21767: variable 'ansible_search_path' from source: unknown 16380 1727204153.21771: variable 'ansible_search_path' from source: unknown 16380 1727204153.21819: calling self._execute() 16380 1727204153.21965: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204153.21968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204153.21972: variable 'omit' from source: magic vars 16380 1727204153.22480: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.22484: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204153.22670: variable 'network_state' from source: role '' defaults 16380 1727204153.22674: Evaluated conditional (network_state != {}): False 16380 1727204153.22677: when evaluation is False, skipping this task 16380 1727204153.22680: _execute() done 16380 1727204153.22685: dumping result to json 16380 1727204153.22773: done dumping result, returning 16380 1727204153.22782: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-749c-b6eb-000000000017] 16380 1727204153.22786: sending task result for task 12b410aa-8751-749c-b6eb-000000000017 16380 1727204153.22898: done sending task result for task 12b410aa-8751-749c-b6eb-000000000017 16380 1727204153.22905: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204153.22987: no more pending results, returning what we have 16380 1727204153.22992: results queue empty 16380 1727204153.22994: checking for any_errors_fatal 16380 1727204153.23003: done checking for any_errors_fatal 16380 1727204153.23008: checking for max_fail_percentage 16380 1727204153.23010: done checking for max_fail_percentage 16380 1727204153.23011: checking to see if all hosts have failed and the running result is not ok 16380 1727204153.23012: done checking to see if all hosts have failed 16380 1727204153.23013: getting the remaining hosts for this loop 16380 1727204153.23015: done getting the remaining hosts for this loop 16380 1727204153.23018: getting the next task for host managed-node2 16380 1727204153.23024: done getting next task for host managed-node2 16380 1727204153.23029: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16380 1727204153.23034: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204153.23050: getting variables 16380 1727204153.23052: in VariableManager get_vars() 16380 1727204153.23134: Calling all_inventory to load vars for managed-node2 16380 1727204153.23138: Calling groups_inventory to load vars for managed-node2 16380 1727204153.23143: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204153.23155: Calling all_plugins_play to load vars for managed-node2 16380 1727204153.23159: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204153.23164: Calling groups_plugins_play to load vars for managed-node2 16380 1727204153.24676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204153.27392: done with get_vars() 16380 1727204153.27442: done getting variables 16380 1727204153.27515: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.066) 0:00:14.382 ***** 16380 1727204153.27553: entering _queue_task() for managed-node2/fail 16380 1727204153.28336: worker is 1 (out of 1 available) 16380 1727204153.28352: exiting _queue_task() for managed-node2/fail 16380 1727204153.28371: done queuing things up, now waiting for results queue to drain 16380 1727204153.28374: waiting for pending results... 
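
In the trace that follows, both conditions of the EL10 teaming guard are visible as separate evaluations: `ansible_distribution_major_version | int > 9` comes back True, then `ansible_distribution in __network_rh_distros` comes back False, so the task skips. A sketch of that guard with the two conditions copied from the log (the fail message is an assumed placeholder):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        # Message wording is an assumption
        msg: Teaming is not supported on EL10 or later
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
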
16380 1727204153.29077: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16380 1727204153.29082: in run() - task 12b410aa-8751-749c-b6eb-000000000018 16380 1727204153.29085: variable 'ansible_search_path' from source: unknown 16380 1727204153.29088: variable 'ansible_search_path' from source: unknown 16380 1727204153.29153: calling self._execute() 16380 1727204153.29415: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204153.29420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204153.29424: variable 'omit' from source: magic vars 16380 1727204153.29968: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.29985: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204153.30267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204153.33491: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204153.33582: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204153.33642: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204153.33671: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204153.33798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204153.33802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204153.33835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204153.33865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.33917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204153.33933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204153.34045: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.34063: Evaluated conditional (ansible_distribution_major_version | int > 9): True 16380 1727204153.34261: variable 'ansible_distribution' from source: facts 16380 1727204153.34273: variable '__network_rh_distros' from source: role '' defaults 16380 1727204153.34288: Evaluated conditional (ansible_distribution in __network_rh_distros): False 16380 1727204153.34300: when evaluation is False, skipping this task 16380 1727204153.34308: _execute() done 16380 1727204153.34343: dumping result to json 16380 1727204153.34354: done dumping result, returning 16380 1727204153.34364: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-749c-b6eb-000000000018] 16380 1727204153.34368: sending task result for task 12b410aa-8751-749c-b6eb-000000000018 16380 1727204153.34531: done sending task result for task 12b410aa-8751-749c-b6eb-000000000018 16380 1727204153.34535: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 16380 1727204153.34598: no more pending results, returning what we have 16380 1727204153.34602: results queue empty 16380 1727204153.34604: checking for any_errors_fatal 16380 1727204153.34611: done checking for any_errors_fatal 16380 1727204153.34612: checking for max_fail_percentage 16380 1727204153.34614: done checking for max_fail_percentage 16380 1727204153.34615: checking to see if all hosts have failed and the running result is not ok 16380 1727204153.34616: done checking to see if all hosts have failed 16380 1727204153.34617: getting the remaining hosts for this loop 16380 1727204153.34619: done getting the remaining hosts for this loop 16380 1727204153.34624: getting the next task for host managed-node2 16380 1727204153.34633: done getting next task for host managed-node2 16380 1727204153.34638: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16380 1727204153.34641: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204153.34659: getting variables 16380 1727204153.34661: in VariableManager get_vars() 16380 1727204153.34714: Calling all_inventory to load vars for managed-node2 16380 1727204153.34718: Calling groups_inventory to load vars for managed-node2 16380 1727204153.34721: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204153.34735: Calling all_plugins_play to load vars for managed-node2 16380 1727204153.34739: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204153.34744: Calling groups_plugins_play to load vars for managed-node2 16380 1727204153.38383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204153.41250: done with get_vars() 16380 1727204153.41315: done getting variables 16380 1727204153.41501: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.139) 0:00:14.522 ***** 16380 1727204153.41555: entering _queue_task() for managed-node2/dnf 16380 1727204153.42156: worker is 1 (out of 1 available) 16380 1727204153.42170: exiting _queue_task() for managed-node2/dnf 16380 1727204153.42183: done queuing things up, now waiting for results queue to drain 16380 1727204153.42186: waiting for pending results... 
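
The trace that follows loads `__network_wireless_connections_defined` and `__network_team_connections_defined` from the role defaults and repeatedly resolves the templated `interface` set_fact inside `network_connections` before the combined conditional comes back False. One plausible way such flags could be derived from `network_connections` (an assumed expression for illustration only; the role's actual defaults may differ):

    # defaults/main.yml (sketch, assumed expressions)
    __network_wireless_connections_defined: "{{ network_connections
      | selectattr('type', 'defined')
      | selectattr('type', 'eq', 'wireless')
      | list | length > 0 }}"
    __network_team_connections_defined: "{{ network_connections
      | selectattr('type', 'defined')
      | selectattr('type', 'eq', 'team')
      | list | length > 0 }}"
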
16380 1727204153.42656: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16380 1727204153.42727: in run() - task 12b410aa-8751-749c-b6eb-000000000019 16380 1727204153.42895: variable 'ansible_search_path' from source: unknown 16380 1727204153.42900: variable 'ansible_search_path' from source: unknown 16380 1727204153.42908: calling self._execute() 16380 1727204153.42913: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204153.42916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204153.42988: variable 'omit' from source: magic vars 16380 1727204153.43936: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.44093: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204153.44284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204153.50020: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204153.50099: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204153.50234: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204153.50237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204153.50240: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204153.50359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204153.50402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204153.50439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.50500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204153.50524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204153.50697: variable 'ansible_distribution' from source: facts 16380 1727204153.50709: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.50723: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 16380 1727204153.50891: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204153.51156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204153.51347: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204153.51448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.51466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204153.51492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204153.51552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204153.51586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204153.51632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.51731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204153.51764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204153.51842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204153.51935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204153.51940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.52020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204153.52076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204153.52302: variable 'network_connections' from source: play vars 16380 1727204153.52336: variable 'interface' from source: set_fact 16380 1727204153.52438: variable 'interface' from source: set_fact 16380 1727204153.52551: variable 'interface' from source: set_fact 16380 1727204153.52555: variable 'interface' from source: set_fact 16380 1727204153.52645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 16380 1727204153.52905: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204153.52954: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204153.52999: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204153.53056: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204153.53154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204153.53185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204153.53298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.53344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204153.53581: variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204153.53785: variable 'network_connections' from source: play vars 16380 1727204153.53832: variable 'interface' from source: set_fact 16380 1727204153.53931: variable 'interface' from source: set_fact 16380 1727204153.53948: variable 'interface' from source: set_fact 16380 1727204153.54036: variable 'interface' from source: set_fact 16380 1727204153.54077: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16380 1727204153.54085: when evaluation is False, skipping this task 16380 1727204153.54096: _execute() done 16380 1727204153.54234: dumping result to json 16380 1727204153.54247: done dumping result, returning 16380 1727204153.54262: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-000000000019] 16380 1727204153.54275: sending task result for task 12b410aa-8751-749c-b6eb-000000000019 16380 1727204153.54429: done sending task result for task 12b410aa-8751-749c-b6eb-000000000019 16380 1727204153.54433: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16380 1727204153.54546: no more pending results, returning what we have 16380 1727204153.54550: results queue empty 16380 1727204153.54551: checking for any_errors_fatal 16380 1727204153.54557: done checking for any_errors_fatal 16380 1727204153.54558: checking for max_fail_percentage 16380 1727204153.54560: done checking for max_fail_percentage 16380 1727204153.54561: checking to see if all hosts have failed and the running result is not ok 16380 1727204153.54562: done checking to see if all hosts have failed 16380 1727204153.54563: getting the remaining hosts for this loop 16380 1727204153.54565: done getting the remaining hosts for this loop 16380 
1727204153.54569: getting the next task for host managed-node2 16380 1727204153.54576: done getting next task for host managed-node2 16380 1727204153.54580: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16380 1727204153.54583: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204153.54600: getting variables 16380 1727204153.54602: in VariableManager get_vars() 16380 1727204153.54644: Calling all_inventory to load vars for managed-node2 16380 1727204153.54647: Calling groups_inventory to load vars for managed-node2 16380 1727204153.54649: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204153.54659: Calling all_plugins_play to load vars for managed-node2 16380 1727204153.54662: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204153.54666: Calling groups_plugins_play to load vars for managed-node2 16380 1727204153.61320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204153.70262: done with get_vars() 16380 1727204153.70321: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16380 1727204153.70657: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.291) 0:00:14.813 ***** 16380 1727204153.70742: entering _queue_task() for managed-node2/yum 16380 1727204153.70745: Creating lock for yum 16380 1727204153.71385: worker is 1 (out of 1 available) 16380 1727204153.71401: exiting _queue_task() for managed-node2/yum 16380 1727204153.71415: done queuing things up, now waiting for results queue to drain 16380 1727204153.71417: waiting for pending results... 
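
The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above records that on this controller the yum action is served by the dnf action plugin; the task is still gated to EL7 and older, which is why the evaluation that follows (`ansible_distribution_major_version | int < 8`: False) skips it, while its DNF twin earlier ran under the complementary Fedora/EL8+ guard. A sketch of the paired guards, with both `when:` expressions copied from the two logged evaluations and the module arguments assumed for illustration:

    - name: Check network package updates through DNF (wireless/team)  # condensed name
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed argument
        state: latest
      check_mode: true
      when: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7

    - name: Check network package updates through YUM (wireless/team)  # condensed name
      ansible.builtin.yum:
        name: "{{ network_packages }}"   # assumed argument
        state: latest
      check_mode: true
      when: ansible_distribution_major_version | int < 8
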
16380 1727204153.71831: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16380 1727204153.71836: in run() - task 12b410aa-8751-749c-b6eb-00000000001a 16380 1727204153.71840: variable 'ansible_search_path' from source: unknown 16380 1727204153.71842: variable 'ansible_search_path' from source: unknown 16380 1727204153.71865: calling self._execute() 16380 1727204153.71981: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204153.71998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204153.72037: variable 'omit' from source: magic vars 16380 1727204153.72711: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.72729: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204153.73040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204153.77354: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204153.77436: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204153.77535: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204153.77546: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204153.77586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204153.77695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204153.77740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204153.77785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.77849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204153.77907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204153.78015: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.78079: Evaluated conditional (ansible_distribution_major_version | int < 8): False 16380 1727204153.78083: when evaluation is False, skipping this task 16380 1727204153.78086: _execute() done 16380 1727204153.78090: dumping result to json 16380 1727204153.78094: done dumping result, returning 16380 1727204153.78097: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-00000000001a] 16380 
1727204153.78100: sending task result for task 12b410aa-8751-749c-b6eb-00000000001a skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 16380 1727204153.78280: no more pending results, returning what we have 16380 1727204153.78284: results queue empty 16380 1727204153.78285: checking for any_errors_fatal 16380 1727204153.78293: done checking for any_errors_fatal 16380 1727204153.78294: checking for max_fail_percentage 16380 1727204153.78296: done checking for max_fail_percentage 16380 1727204153.78297: checking to see if all hosts have failed and the running result is not ok 16380 1727204153.78298: done checking to see if all hosts have failed 16380 1727204153.78299: getting the remaining hosts for this loop 16380 1727204153.78301: done getting the remaining hosts for this loop 16380 1727204153.78305: getting the next task for host managed-node2 16380 1727204153.78316: done getting next task for host managed-node2 16380 1727204153.78321: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16380 1727204153.78323: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204153.78339: getting variables 16380 1727204153.78343: in VariableManager get_vars() 16380 1727204153.78568: Calling all_inventory to load vars for managed-node2 16380 1727204153.78572: Calling groups_inventory to load vars for managed-node2 16380 1727204153.78575: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204153.78588: Calling all_plugins_play to load vars for managed-node2 16380 1727204153.78594: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204153.78599: Calling groups_plugins_play to load vars for managed-node2 16380 1727204153.79243: done sending task result for task 12b410aa-8751-749c-b6eb-00000000001a 16380 1727204153.79246: WORKER PROCESS EXITING 16380 1727204153.84724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204153.88710: done with get_vars() 16380 1727204153.88765: done getting variables 16380 1727204153.88844: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.181) 0:00:14.995 ***** 16380 1727204153.88885: entering _queue_task() for managed-node2/fail 16380 1727204153.89409: worker is 1 (out of 1 available) 16380 1727204153.89421: exiting _queue_task() for managed-node2/fail 16380 1727204153.89433: done queuing things up, now waiting for results queue to drain 16380 1727204153.89435: waiting for pending results... 
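
The fail task queued above only fires when wireless or team profiles are present; the evaluation below shows `__network_wireless_connections_defined or __network_team_connections_defined` coming back False, so no consent is needed on this run. A sketch of such a consent gate, where the logged condition is verbatim but the opt-in variable name network_allow_restart and the message are assumptions:

    - name: Ask user's consent to restart NetworkManager
      ansible.builtin.fail:
        # Message wording is an assumption
        msg: >-
          Managing wireless or team interfaces requires restarting
          NetworkManager; set network_allow_restart to allow this.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - not network_allow_restart | default(false)   # hypothetical opt-in flag
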
16380 1727204153.89679: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16380 1727204153.89754: in run() - task 12b410aa-8751-749c-b6eb-00000000001b 16380 1727204153.89994: variable 'ansible_search_path' from source: unknown 16380 1727204153.89999: variable 'ansible_search_path' from source: unknown 16380 1727204153.90002: calling self._execute() 16380 1727204153.90149: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204153.90222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204153.90247: variable 'omit' from source: magic vars 16380 1727204153.91287: variable 'ansible_distribution_major_version' from source: facts 16380 1727204153.91312: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204153.91725: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204153.92404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204153.96655: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204153.96757: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204153.96823: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204153.96882: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204153.96933: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204153.97042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204153.97098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204153.97133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.97208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204153.97259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204153.97306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204153.97351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204153.97396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.97477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204153.97491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204153.97585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204153.97602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204153.97644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.97753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204153.97756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204153.98159: variable 'network_connections' from source: play vars 16380 1727204153.98180: variable 'interface' from source: set_fact 16380 1727204153.98295: variable 'interface' from source: set_fact 16380 1727204153.98351: variable 'interface' from source: set_fact 16380 1727204153.98394: variable 'interface' from source: set_fact 16380 1727204153.98493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204153.98751: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204153.98806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204153.98858: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204153.98952: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204153.98964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204153.98997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204153.99036: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204153.99079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204153.99158: 
variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204153.99510: variable 'network_connections' from source: play vars 16380 1727204153.99521: variable 'interface' from source: set_fact 16380 1727204153.99618: variable 'interface' from source: set_fact 16380 1727204153.99796: variable 'interface' from source: set_fact 16380 1727204153.99799: variable 'interface' from source: set_fact 16380 1727204153.99802: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16380 1727204153.99804: when evaluation is False, skipping this task 16380 1727204153.99806: _execute() done 16380 1727204153.99808: dumping result to json 16380 1727204153.99810: done dumping result, returning 16380 1727204153.99812: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-00000000001b] 16380 1727204153.99824: sending task result for task 12b410aa-8751-749c-b6eb-00000000001b skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16380 1727204154.00151: no more pending results, returning what we have 16380 1727204154.00163: results queue empty 16380 1727204154.00164: checking for any_errors_fatal 16380 1727204154.00173: done checking for any_errors_fatal 16380 1727204154.00173: checking for max_fail_percentage 16380 1727204154.00175: done checking for max_fail_percentage 16380 1727204154.00176: checking to see if all hosts have failed and the running result is not ok 16380 1727204154.00177: done checking to see if all hosts have failed 16380 1727204154.00178: getting the remaining hosts for this loop 16380 1727204154.00180: done getting the remaining hosts for this loop 16380 1727204154.00186: getting the next task for host managed-node2 16380 1727204154.00275: done getting next task for host managed-node2 16380 1727204154.00280: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 16380 1727204154.00283: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204154.00303: getting variables 16380 1727204154.00305: in VariableManager get_vars() 16380 1727204154.00598: Calling all_inventory to load vars for managed-node2 16380 1727204154.00602: Calling groups_inventory to load vars for managed-node2 16380 1727204154.00604: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204154.00620: Calling all_plugins_play to load vars for managed-node2 16380 1727204154.00624: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204154.00628: Calling groups_plugins_play to load vars for managed-node2 16380 1727204154.01207: done sending task result for task 12b410aa-8751-749c-b6eb-00000000001b 16380 1727204154.01211: WORKER PROCESS EXITING 16380 1727204154.03350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204154.14980: done with get_vars() 16380 1727204154.15026: done getting variables 16380 1727204154.15102: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.262) 0:00:15.258 ***** 16380 1727204154.15168: entering _queue_task() for managed-node2/package 16380 1727204154.15823: worker is 1 (out of 1 available) 16380 1727204154.15835: exiting _queue_task() for managed-node2/package 16380 1727204154.15847: done queuing things up, now waiting for results queue to drain 16380 1727204154.15850: waiting for pending results... 
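
The "Install packages" task queued above is driven by the subset test the trace below eventually evaluates: `not network_packages is subset(ansible_facts.packages.keys())` comes back False, meaning every required package already appears in the gathered package facts, so nothing is installed. A sketch of that pattern; the condition is verbatim from the log and the package module matches the "Loading ActionModule 'package'" entry above, while the module arguments are assumed:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      # Skip the package transaction entirely when the facts gathered by
      # package_facts already contain every entry of network_packages.
      when: not network_packages is subset(ansible_facts.packages.keys())
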
16380 1727204154.16072: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages
16380 1727204154.16218: in run() - task 12b410aa-8751-749c-b6eb-00000000001c
16380 1727204154.16249: variable 'ansible_search_path' from source: unknown
16380 1727204154.16259: variable 'ansible_search_path' from source: unknown
16380 1727204154.16311: calling self._execute()
16380 1727204154.16429: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204154.16445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204154.16473: variable 'omit' from source: magic vars
16380 1727204154.17242: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.17344: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204154.17595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
16380 1727204154.18497: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
16380 1727204154.18509: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
16380 1727204154.18738: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
16380 1727204154.18829: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
16380 1727204154.19177: variable 'network_packages' from source: role '' defaults
16380 1727204154.19370: variable '__network_provider_setup' from source: role '' defaults
16380 1727204154.19393: variable '__network_service_name_default_nm' from source: role '' defaults
16380 1727204154.19514: variable '__network_service_name_default_nm' from source: role '' defaults
16380 1727204154.19535: variable '__network_packages_default_nm' from source: role '' defaults
16380 1727204154.19667: variable '__network_packages_default_nm' from source: role '' defaults
16380 1727204154.20034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
16380 1727204154.23709: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
16380 1727204154.23806: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
16380 1727204154.23874: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
16380 1727204154.23924: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
16380 1727204154.23969: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
16380 1727204154.24076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.24195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.24201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.24222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.24282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.24420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.24698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.24701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.24807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.24812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.25426: variable '__network_packages_default_gobject_packages' from source: role '' defaults
16380 1727204154.25782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.25822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.25887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.25999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.26023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.26144: variable 'ansible_python' from source: facts
16380 1727204154.26197: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
16380 1727204154.26321: variable '__network_wpa_supplicant_required' from source: role '' defaults
16380 1727204154.26435: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
16380 1727204154.26615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.26651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.26718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.26753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.26776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.26895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.26908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.26958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.27016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.27154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.27272: variable 'network_connections' from source: play vars
16380 1727204154.27287: variable 'interface' from source: set_fact
16380 1727204154.27433: variable 'interface' from source: set_fact
16380 1727204154.27481: variable 'interface' from source: set_fact
16380 1727204154.27598: variable 'interface' from source: set_fact
16380 1727204154.27821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
16380 1727204154.28075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
16380 1727204154.28145: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.28496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
16380 1727204154.28500: variable '__network_wireless_connections_defined' from source: role '' defaults
16380 1727204154.29802: variable 'network_connections' from source: play vars
16380 1727204154.29814: variable 'interface' from source: set_fact
16380 1727204154.29951: variable 'interface' from source: set_fact
16380 1727204154.30078: variable 'interface' from source: set_fact
16380 1727204154.30325: variable 'interface' from source: set_fact
16380 1727204154.30537: variable '__network_packages_default_wireless' from source: role '' defaults
16380 1727204154.30654: variable '__network_wireless_connections_defined' from source: role '' defaults
16380 1727204154.31330: variable 'network_connections' from source: play vars
16380 1727204154.31364: variable 'interface' from source: set_fact
16380 1727204154.31457: variable 'interface' from source: set_fact
16380 1727204154.31475: variable 'interface' from source: set_fact
16380 1727204154.31566: variable 'interface' from source: set_fact
16380 1727204154.31818: variable '__network_packages_default_team' from source: role '' defaults
16380 1727204154.32012: variable '__network_team_connections_defined' from source: role '' defaults
16380 1727204154.33054: variable 'network_connections' from source: play vars
16380 1727204154.33161: variable 'interface' from source: set_fact
16380 1727204154.33231: variable 'interface' from source: set_fact
16380 1727204154.33291: variable 'interface' from source: set_fact
16380 1727204154.33467: variable 'interface' from source: set_fact
16380 1727204154.33582: variable '__network_service_name_default_initscripts' from source: role '' defaults
16380 1727204154.33779: variable '__network_service_name_default_initscripts' from source: role '' defaults
16380 1727204154.33907: variable '__network_packages_default_initscripts' from source: role '' defaults
16380 1727204154.33993: variable '__network_packages_default_initscripts' from source: role '' defaults
16380 1727204154.34517: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
16380 1727204154.34946: variable 'network_connections' from source: play vars
16380 1727204154.34949: variable 'interface' from source: set_fact
16380 1727204154.35003: variable 'interface' from source: set_fact
16380 1727204154.35010: variable 'interface' from source: set_fact
16380 1727204154.35064: variable 'interface' from source: set_fact
16380 1727204154.35073: variable 'ansible_distribution' from source: facts
16380 1727204154.35078: variable '__network_rh_distros' from source: role '' defaults
16380 1727204154.35084: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.35107: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
16380 1727204154.35255: variable 'ansible_distribution' from source: facts
16380 1727204154.35259: variable '__network_rh_distros' from source: role '' defaults
16380 1727204154.35265: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.35272: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
16380 1727204154.35418: variable 'ansible_distribution' from source: facts
16380 1727204154.35421: variable '__network_rh_distros' from source: role '' defaults
16380 1727204154.35428: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.35464: variable 'network_provider' from source: set_fact
16380 1727204154.35498: variable 'ansible_facts' from source: unknown
16380 1727204154.36924: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
16380 1727204154.36928: when evaluation is False, skipping this task
16380 1727204154.36930: _execute() done
16380 1727204154.36933: dumping result to json
16380 1727204154.36937: done dumping result, returning
16380 1727204154.36940: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-749c-b6eb-00000000001c]
16380 1727204154.36942: sending task result for task 12b410aa-8751-749c-b6eb-00000000001c
16380 1727204154.37025: done sending task result for task 12b410aa-8751-749c-b6eb-00000000001c
16380 1727204154.37028: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
16380 1727204154.37087: no more pending results, returning what we have
16380 1727204154.37093: results queue empty
16380 1727204154.37094: checking for any_errors_fatal
16380 1727204154.37102: done checking for any_errors_fatal
16380 1727204154.37103: checking for max_fail_percentage
16380 1727204154.37105: done checking for max_fail_percentage
16380 1727204154.37106: checking to see if all hosts have failed and the running result is not ok
16380 1727204154.37107: done checking to see if all hosts have failed
16380 1727204154.37107: getting the remaining hosts for this loop
16380 1727204154.37109: done getting the remaining hosts for this loop
16380 1727204154.37114: getting the next task for host managed-node2
16380 1727204154.37121: done getting next task for host managed-node2
16380 1727204154.37125: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
16380 1727204154.37127: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204154.37145: getting variables
16380 1727204154.37147: in VariableManager get_vars()
16380 1727204154.37313: Calling all_inventory to load vars for managed-node2
16380 1727204154.37320: Calling groups_inventory to load vars for managed-node2
16380 1727204154.37324: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204154.37341: Calling all_plugins_play to load vars for managed-node2
16380 1727204154.37344: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204154.37348: Calling groups_plugins_play to load vars for managed-node2
16380 1727204154.39679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204154.43195: done with get_vars()
16380 1727204154.43245: done getting variables
16380 1727204154.43317: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.281) 0:00:15.540 *****
16380 1727204154.43364: entering _queue_task() for managed-node2/package
16380 1727204154.43778: worker is 1 (out of 1 available)
16380 1727204154.43798: exiting _queue_task() for managed-node2/package
16380 1727204154.43814: done queuing things up, now waiting for results queue to drain
16380 1727204154.43816: waiting for pending results...
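
The skip above is fully determined by the logged false_condition: every entry in network_packages is already present in the gathered package facts, so the subset test holds and the when clause fails before any package action runs. A minimal YAML sketch of a task shaped like the one being skipped, assuming the guard mirrors the two logged conditionals (the actual task at roles/network/tasks/main.yml may differ):

- name: Install packages        # hypothetical reconstruction, not the role's exact source
  package:
    name: "{{ network_packages }}"
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - not network_packages is subset(ansible_facts.packages.keys())

Skipping here is the intended fast path: the package module never runs, so no SSH connection is opened for this task.
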
16380 1727204154.44504: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
16380 1727204154.44512: in run() - task 12b410aa-8751-749c-b6eb-00000000001d
16380 1727204154.44516: variable 'ansible_search_path' from source: unknown
16380 1727204154.44519: variable 'ansible_search_path' from source: unknown
16380 1727204154.44521: calling self._execute()
16380 1727204154.44721: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204154.44727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204154.44730: variable 'omit' from source: magic vars
16380 1727204154.45107: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.45127: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204154.45287: variable 'network_state' from source: role '' defaults
16380 1727204154.45303: Evaluated conditional (network_state != {}): False
16380 1727204154.45307: when evaluation is False, skipping this task
16380 1727204154.45313: _execute() done
16380 1727204154.45316: dumping result to json
16380 1727204154.45318: done dumping result, returning
16380 1727204154.45327: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-749c-b6eb-00000000001d]
16380 1727204154.45342: sending task result for task 12b410aa-8751-749c-b6eb-00000000001d
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
16380 1727204154.45506: no more pending results, returning what we have
16380 1727204154.45511: results queue empty
16380 1727204154.45512: checking for any_errors_fatal
16380 1727204154.45521: done checking for any_errors_fatal
16380 1727204154.45522: checking for max_fail_percentage
16380 1727204154.45524: done checking for max_fail_percentage
16380 1727204154.45525: checking to see if all hosts have failed and the running result is not ok
16380 1727204154.45526: done checking to see if all hosts have failed
16380 1727204154.45527: getting the remaining hosts for this loop
16380 1727204154.45529: done getting the remaining hosts for this loop
16380 1727204154.45534: getting the next task for host managed-node2
16380 1727204154.45542: done getting next task for host managed-node2
16380 1727204154.45546: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
16380 1727204154.45549: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204154.45567: getting variables
16380 1727204154.45569: in VariableManager get_vars()
16380 1727204154.45617: Calling all_inventory to load vars for managed-node2
16380 1727204154.45621: Calling groups_inventory to load vars for managed-node2
16380 1727204154.45624: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204154.45640: Calling all_plugins_play to load vars for managed-node2
16380 1727204154.45644: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204154.45649: Calling groups_plugins_play to load vars for managed-node2
16380 1727204154.46171: done sending task result for task 12b410aa-8751-749c-b6eb-00000000001d
16380 1727204154.46175: WORKER PROCESS EXITING
16380 1727204154.49683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204154.54367: done with get_vars()
16380 1727204154.54423: done getting variables
16380 1727204154.54582: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.112) 0:00:15.653 *****
16380 1727204154.54659: entering _queue_task() for managed-node2/package
16380 1727204154.55172: worker is 1 (out of 1 available)
16380 1727204154.55191: exiting _queue_task() for managed-node2/package
16380 1727204154.55206: done queuing things up, now waiting for results queue to drain
16380 1727204154.55211: waiting for pending results...
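
The task just skipped and the one just queued both hang off the same guard: network_state comes from the role defaults as an empty dict, so network_state != {} evaluates False and each task short-circuits. A hypothetical sketch of the skipped task, with the package list inferred from the task title only:

- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # assumed from the task name; the role's real package list is not shown in this log
      - nmstate
    state: present
  when: network_state != {}
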
16380 1727204154.55595: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
16380 1727204154.55767: in run() - task 12b410aa-8751-749c-b6eb-00000000001e
16380 1727204154.55792: variable 'ansible_search_path' from source: unknown
16380 1727204154.55858: variable 'ansible_search_path' from source: unknown
16380 1727204154.55978: calling self._execute()
16380 1727204154.56034: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204154.56050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204154.56069: variable 'omit' from source: magic vars
16380 1727204154.56668: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.56755: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204154.56913: variable 'network_state' from source: role '' defaults
16380 1727204154.56931: Evaluated conditional (network_state != {}): False
16380 1727204154.56945: when evaluation is False, skipping this task
16380 1727204154.56972: _execute() done
16380 1727204154.56975: dumping result to json
16380 1727204154.56996: done dumping result, returning
16380 1727204154.57199: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-749c-b6eb-00000000001e]
16380 1727204154.57206: sending task result for task 12b410aa-8751-749c-b6eb-00000000001e
16380 1727204154.57300: done sending task result for task 12b410aa-8751-749c-b6eb-00000000001e
16380 1727204154.57305: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
16380 1727204154.57368: no more pending results, returning what we have
16380 1727204154.57373: results queue empty
16380 1727204154.57374: checking for any_errors_fatal
16380 1727204154.57384: done checking for any_errors_fatal
16380 1727204154.57385: checking for max_fail_percentage
16380 1727204154.57387: done checking for max_fail_percentage
16380 1727204154.57388: checking to see if all hosts have failed and the running result is not ok
16380 1727204154.57392: done checking to see if all hosts have failed
16380 1727204154.57393: getting the remaining hosts for this loop
16380 1727204154.57395: done getting the remaining hosts for this loop
16380 1727204154.57400: getting the next task for host managed-node2
16380 1727204154.57407: done getting next task for host managed-node2
16380 1727204154.57412: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
16380 1727204154.57416: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204154.57438: getting variables
16380 1727204154.57440: in VariableManager get_vars()
16380 1727204154.57642: Calling all_inventory to load vars for managed-node2
16380 1727204154.57646: Calling groups_inventory to load vars for managed-node2
16380 1727204154.57650: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204154.57726: Calling all_plugins_play to load vars for managed-node2
16380 1727204154.57732: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204154.57737: Calling groups_plugins_play to load vars for managed-node2
16380 1727204154.61984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204154.65813: done with get_vars()
16380 1727204154.65959: done getting variables
16380 1727204154.66138: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.115) 0:00:15.768 *****
16380 1727204154.66172: entering _queue_task() for managed-node2/service
16380 1727204154.66174: Creating lock for service
16380 1727204154.67075: worker is 1 (out of 1 available)
16380 1727204154.67087: exiting _queue_task() for managed-node2/service
16380 1727204154.67102: done queuing things up, now waiting for results queue to drain
16380 1727204154.67105: waiting for pending results...
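
For contrast, a hypothetical play-vars snippet that would flip both network_state-gated tasks above from skipped to executed; the interface entry follows the generic nmstate schema and eth1 is a made-up device:

- hosts: managed-node2
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_state:              # any non-empty dict makes network_state != {} evaluate True
      interfaces:
        - name: eth1            # hypothetical interface, purely illustrative
          type: ethernet
          state: up
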
16380 1727204154.67520: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
16380 1727204154.67528: in run() - task 12b410aa-8751-749c-b6eb-00000000001f
16380 1727204154.67594: variable 'ansible_search_path' from source: unknown
16380 1727204154.67598: variable 'ansible_search_path' from source: unknown
16380 1727204154.67606: calling self._execute()
16380 1727204154.67738: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204154.67752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204154.67768: variable 'omit' from source: magic vars
16380 1727204154.68320: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.68342: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204154.68536: variable '__network_wireless_connections_defined' from source: role '' defaults
16380 1727204154.68895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
16380 1727204154.71717: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
16380 1727204154.71817: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
16380 1727204154.71849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
16380 1727204154.71893: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
16380 1727204154.71924: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
16380 1727204154.72031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.72062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.72103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.72159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.72175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.72222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.72248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.72274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.72355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.72366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.72432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.72435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.72465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.72503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.72517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.72661: variable 'network_connections' from source: play vars
16380 1727204154.72673: variable 'interface' from source: set_fact
16380 1727204154.72750: variable 'interface' from source: set_fact
16380 1727204154.72758: variable 'interface' from source: set_fact
16380 1727204154.72818: variable 'interface' from source: set_fact
16380 1727204154.72873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
16380 1727204154.73032: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
16380 1727204154.73093: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
16380 1727204154.73114: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
16380 1727204154.73143: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
16380 1727204154.73208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
16380 1727204154.73223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
16380 1727204154.73251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.73283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
16380 1727204154.73356: variable '__network_team_connections_defined' from source: role '' defaults
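
The two role defaults resolved above drive the conditional evaluated just below: each scans network_connections for a wireless or team profile. The play under test defines only an ordinary interface via set_fact, so both come out false and the restart is skipped. A hypothetical profile of the kind that would trigger the restart path instead:

network_connections:
  - name: wifi0                 # hypothetical wireless profile; the tested play has no such entry
    type: wireless
    interface_name: wlan0
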
16380 1727204154.73592: variable 'network_connections' from source: play vars
16380 1727204154.73596: variable 'interface' from source: set_fact
16380 1727204154.73685: variable 'interface' from source: set_fact
16380 1727204154.73688: variable 'interface' from source: set_fact
16380 1727204154.73759: variable 'interface' from source: set_fact
16380 1727204154.73787: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
16380 1727204154.73804: when evaluation is False, skipping this task
16380 1727204154.73807: _execute() done
16380 1727204154.73812: dumping result to json
16380 1727204154.73814: done dumping result, returning
16380 1727204154.73817: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-00000000001f]
16380 1727204154.73827: sending task result for task 12b410aa-8751-749c-b6eb-00000000001f
16380 1727204154.73978: done sending task result for task 12b410aa-8751-749c-b6eb-00000000001f
16380 1727204154.73980: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
16380 1727204154.74050: no more pending results, returning what we have
16380 1727204154.74053: results queue empty
16380 1727204154.74054: checking for any_errors_fatal
16380 1727204154.74061: done checking for any_errors_fatal
16380 1727204154.74062: checking for max_fail_percentage
16380 1727204154.74064: done checking for max_fail_percentage
16380 1727204154.74065: checking to see if all hosts have failed and the running result is not ok
16380 1727204154.74065: done checking to see if all hosts have failed
16380 1727204154.74066: getting the remaining hosts for this loop
16380 1727204154.74073: done getting the remaining hosts for this loop
16380 1727204154.74077: getting the next task for host managed-node2
16380 1727204154.74084: done getting next task for host managed-node2
16380 1727204154.74090: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
16380 1727204154.74093: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204154.74108: getting variables
16380 1727204154.74112: in VariableManager get_vars()
16380 1727204154.74150: Calling all_inventory to load vars for managed-node2
16380 1727204154.74153: Calling groups_inventory to load vars for managed-node2
16380 1727204154.74155: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204154.74169: Calling all_plugins_play to load vars for managed-node2
16380 1727204154.74173: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204154.74177: Calling groups_plugins_play to load vars for managed-node2
16380 1727204154.76358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204154.79663: done with get_vars()
16380 1727204154.79694: done getting variables
16380 1727204154.79747: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.135) 0:00:15.904 *****
16380 1727204154.79772: entering _queue_task() for managed-node2/service
16380 1727204154.80041: worker is 1 (out of 1 available)
16380 1727204154.80056: exiting _queue_task() for managed-node2/service
16380 1727204154.80069: done queuing things up, now waiting for results queue to drain
16380 1727204154.80071: waiting for pending results...
16380 1727204154.80261: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
16380 1727204154.80342: in run() - task 12b410aa-8751-749c-b6eb-000000000020
16380 1727204154.80355: variable 'ansible_search_path' from source: unknown
16380 1727204154.80359: variable 'ansible_search_path' from source: unknown
16380 1727204154.80405: calling self._execute()
16380 1727204154.80552: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204154.80557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204154.80560: variable 'omit' from source: magic vars
16380 1727204154.81031: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.81035: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204154.81210: variable 'network_provider' from source: set_fact
16380 1727204154.81219: variable 'network_state' from source: role '' defaults
16380 1727204154.81231: Evaluated conditional (network_provider == "nm" or network_state != {}): True
16380 1727204154.81242: variable 'omit' from source: magic vars
16380 1727204154.81288: variable 'omit' from source: magic vars
16380 1727204154.81326: variable 'network_service_name' from source: role '' defaults
16380 1727204154.81417: variable 'network_service_name' from source: role '' defaults
16380 1727204154.81550: variable '__network_provider_setup' from source: role '' defaults
16380 1727204154.81581: variable '__network_service_name_default_nm' from source: role '' defaults
16380 1727204154.81693: variable '__network_service_name_default_nm' from source: role '' defaults
16380 1727204154.81700: variable '__network_packages_default_nm' from source: role '' defaults
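
This is the first task in the section whose guard holds: network_state is still empty, but network_provider was set to "nm" earlier via set_fact, so the disjunction is True and the executor proceeds to resolve network_service_name and the provider setup variables below, then opens an SSH connection and pushes a systemd module. A YAML sketch of what the task plausibly looks like; the service action, the variable name, and the condition are confirmed by the log, while the state/enabled values are assumptions drawn from the task title:

- name: Enable and start NetworkManager   # hypothetical sketch of the task being executed
  service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
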
16380 1727204154.81755: variable '__network_packages_default_nm' from source: role '' defaults
16380 1727204154.82104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
16380 1727204154.85268: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
16380 1727204154.85301: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
16380 1727204154.85423: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
16380 1727204154.85471: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
16380 1727204154.85521: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
16380 1727204154.85655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.85738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.85751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.85857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.85922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.86095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.86098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.86100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.86199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.86220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.86524: variable '__network_packages_default_gobject_packages' from source: role '' defaults
16380 1727204154.86674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.86704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.86737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.86786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.86804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.86982: variable 'ansible_python' from source: facts
16380 1727204154.86987: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
16380 1727204154.87044: variable '__network_wpa_supplicant_required' from source: role '' defaults
16380 1727204154.87160: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
16380 1727204154.87325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.87416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.87538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.87543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.87546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.87615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
16380 1727204154.87673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
16380 1727204154.87711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.87782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
16380 1727204154.87808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
16380 1727204154.88085: variable 'network_connections' from source: play vars
16380 1727204154.88088: variable 'interface' from source: set_fact
16380 1727204154.88131: variable 'interface' from source: set_fact
16380 1727204154.88150: variable 'interface' from source: set_fact
16380 1727204154.88256: variable 'interface' from source: set_fact
16380 1727204154.88399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
16380 1727204154.88682: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
16380 1727204154.88764: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
16380 1727204154.88843: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
16380 1727204154.88894: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
16380 1727204154.89194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
16380 1727204154.89198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
16380 1727204154.89201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
16380 1727204154.89203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
16380 1727204154.89206: variable '__network_wireless_connections_defined' from source: role '' defaults
16380 1727204154.89579: variable 'network_connections' from source: play vars
16380 1727204154.89594: variable 'interface' from source: set_fact
16380 1727204154.89697: variable 'interface' from source: set_fact
16380 1727204154.89715: variable 'interface' from source: set_fact
16380 1727204154.89817: variable 'interface' from source: set_fact
16380 1727204154.89887: variable '__network_packages_default_wireless' from source: role '' defaults
16380 1727204154.90006: variable '__network_wireless_connections_defined' from source: role '' defaults
16380 1727204154.90434: variable 'network_connections' from source: play vars
16380 1727204154.90445: variable 'interface' from source: set_fact
16380 1727204154.90543: variable 'interface' from source: set_fact
16380 1727204154.90556: variable 'interface' from source: set_fact
16380 1727204154.90653: variable 'interface' from source: set_fact
16380 1727204154.90687: variable '__network_packages_default_team' from source: role '' defaults
16380 1727204154.90808: variable '__network_team_connections_defined' from source: role '' defaults
16380 1727204154.91238: variable 'network_connections' from source: play vars
16380 1727204154.91250: variable 'interface' from source: set_fact
16380 1727204154.91397: variable 'interface' from source: set_fact
16380 1727204154.91406: variable 'interface' from source: set_fact
16380 1727204154.91460: variable 'interface' from source: set_fact
16380 1727204154.91556: variable '__network_service_name_default_initscripts' from source: role '' defaults
16380 1727204154.91646: variable '__network_service_name_default_initscripts' from source: role '' defaults
16380 1727204154.91659: variable '__network_packages_default_initscripts' from source: role '' defaults
16380 1727204154.91748: variable '__network_packages_default_initscripts' from source: role '' defaults
16380 1727204154.92084: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
16380 1727204154.92926: variable 'network_connections' from source: play vars
16380 1727204154.92931: variable 'interface' from source: set_fact
16380 1727204154.92940: variable 'interface' from source: set_fact
16380 1727204154.92953: variable 'interface' from source: set_fact
16380 1727204154.93035: variable 'interface' from source: set_fact
16380 1727204154.93059: variable 'ansible_distribution' from source: facts
16380 1727204154.93069: variable '__network_rh_distros' from source: role '' defaults
16380 1727204154.93081: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.93115: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
16380 1727204154.93382: variable 'ansible_distribution' from source: facts
16380 1727204154.93471: variable '__network_rh_distros' from source: role '' defaults
16380 1727204154.93477: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.93480: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
16380 1727204154.93668: variable 'ansible_distribution' from source: facts
16380 1727204154.93678: variable '__network_rh_distros' from source: role '' defaults
16380 1727204154.93699: variable 'ansible_distribution_major_version' from source: facts
16380 1727204154.93750: variable 'network_provider' from source: set_fact
16380 1727204154.93796: variable 'omit' from source: magic vars
16380 1727204154.93832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
16380 1727204154.93895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
16380 1727204154.93898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
16380 1727204154.93935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204154.93954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204154.93994: variable 'inventory_hostname' from source: host vars for 'managed-node2'
16380 1727204154.94015: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204154.94018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204154.94166: Set connection var ansible_module_compression to ZIP_DEFLATED
16380 1727204154.94180: Set connection var ansible_shell_executable to /bin/sh
16380 1727204154.94233: Set connection var ansible_connection to ssh
16380 1727204154.94236: Set connection var ansible_shell_type to sh
16380 1727204154.94239: Set connection var ansible_pipelining to False
16380 1727204154.94241: Set connection var ansible_timeout to 10
16380 1727204154.94270: variable 'ansible_shell_executable' from source: unknown
16380 1727204154.94278: variable 'ansible_connection' from source: unknown
16380 1727204154.94286: variable 'ansible_module_compression' from source: unknown
16380 1727204154.94298: variable 'ansible_shell_type' from source: unknown
16380 1727204154.94306: variable 'ansible_shell_executable' from source: unknown
16380 1727204154.94343: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204154.94351: variable 'ansible_pipelining' from source: unknown
16380 1727204154.94354: variable 'ansible_timeout' from source: unknown
16380 1727204154.94361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204154.94498: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
16380 1727204154.94561: variable 'omit' from source: magic vars
16380 1727204154.94564: starting attempt loop
16380 1727204154.94567: running the handler
16380 1727204154.94646: variable 'ansible_facts' from source: unknown
16380 1727204154.95919: _low_level_execute_command(): starting
16380 1727204154.95933: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
16380 1727204154.96744: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
16380 1727204154.96878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204154.96898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204154.97057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204154.98855: stdout chunk (state=3): >>>/root <<<
16380 1727204154.98966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204154.99060: stderr chunk (state=3): >>><<<
16380 1727204154.99073: stdout chunk (state=3): >>><<<
16380 1727204154.99103: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204154.99123: _low_level_execute_command(): starting
16380 1727204154.99135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152 `" && echo ansible-tmp-1727204154.9911082-17780-81816795607152="` echo /root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152 `" ) && sleep 0'
16380 1727204154.99791: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
16380 1727204154.99916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<<
16380 1727204154.99935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204154.99959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204155.00039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204155.02214: stdout chunk (state=3): >>>ansible-tmp-1727204154.9911082-17780-81816795607152=/root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152 <<<
16380 1727204155.02343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204155.02443: stderr chunk (state=3): >>><<<
16380 1727204155.02453: stdout chunk (state=3): >>><<<
16380 1727204155.02486: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204154.9911082-17780-81816795607152=/root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204155.02532: variable 'ansible_module_compression' from source: unknown
16380 1727204155.02682: ANSIBALLZ: Using generic lock for ansible.legacy.systemd
16380 1727204155.02686: ANSIBALLZ: Acquiring lock
16380 1727204155.02690: ANSIBALLZ: Lock acquired: 140602939598528
16380 1727204155.02693: ANSIBALLZ: Creating module
16380 1727204155.43904: ANSIBALLZ: Writing module into payload
16380 1727204155.44051: ANSIBALLZ: Writing module
16380 1727204155.44077: ANSIBALLZ: Renaming module
16380 1727204155.44081: ANSIBALLZ: Done creating module
16380 1727204155.44123: variable 'ansible_facts' from source: unknown
16380 1727204155.44266: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/AnsiballZ_systemd.py
16380 1727204155.44395: Sending initial data
16380 1727204155.44404: Sent initial data (155 bytes)
16380 1727204155.44998: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
16380 1727204155.45113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204155.45155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204155.45199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204155.46964: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<<
16380 1727204155.46980: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
16380 1727204155.47012: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
16380 1727204155.47045: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp21iu8ucn /root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/AnsiballZ_systemd.py <<<
16380 1727204155.47052: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/AnsiballZ_systemd.py" <<<
16380 1727204155.47084: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp21iu8ucn" to remote "/root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/AnsiballZ_systemd.py" <<<
16380 1727204155.54151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204155.54238: stderr chunk (state=3): >>><<<
16380 1727204155.54243: stdout chunk (state=3): >>><<<
16380 1727204155.54278: done transferring module to remote
16380 1727204155.54319: _low_level_execute_command(): starting
16380 1727204155.54358: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/ /root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/AnsiballZ_systemd.py && sleep 0'
16380 1727204155.54999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204155.55065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204155.55102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204155.57068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204155.57118: stderr chunk (state=3): >>><<<
16380 1727204155.57123: stdout chunk (state=3): >>><<<
16380 1727204155.57140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204155.57145: _low_level_execute_command(): starting
16380 1727204155.57148: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/AnsiballZ_systemd.py && sleep 0'
16380 1727204155.57578: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
16380 1727204155.57612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204155.57616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
16380 1727204155.57619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204155.57677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204155.57680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204155.57728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204155.91394: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0",
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4382720", "MemoryAvailable": "infinity", "CPUUsageNSec": "1146705000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 16380 1727204155.91418: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": 
"no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 16380 1727204155.91428: stdout chunk (state=3): >>>"loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 16380 1727204155.93552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204155.93620: stderr chunk (state=3): >>><<< 16380 1727204155.93624: stdout chunk (state=3): >>><<< 16380 1727204155.93642: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4382720", "MemoryAvailable": "infinity", "CPUUsageNSec": "1146705000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204155.93816: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204155.93835: _low_level_execute_command(): starting 16380 1727204155.93842: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204154.9911082-17780-81816795607152/ > /dev/null 2>&1 && sleep 0' 16380 1727204155.94343: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204155.94346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204155.94349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204155.94351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204155.94416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204155.94420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204155.94461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 
1727204155.96461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204155.96514: stderr chunk (state=3): >>><<< 16380 1727204155.96518: stdout chunk (state=3): >>><<< 16380 1727204155.96533: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204155.96543: handler run complete 16380 1727204155.96592: attempt loop complete, returning result 16380 1727204155.96596: _execute() done 16380 1727204155.96598: dumping result to json 16380 1727204155.96617: done dumping result, returning 16380 1727204155.96625: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-749c-b6eb-000000000020] 16380 1727204155.96632: sending task result for task 12b410aa-8751-749c-b6eb-000000000020 16380 1727204155.96918: done sending task result for task 12b410aa-8751-749c-b6eb-000000000020 16380 1727204155.96921: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204155.96980: no more pending results, returning what we have 16380 1727204155.96983: results queue empty 16380 1727204155.96984: checking for any_errors_fatal 16380 1727204155.96992: done checking for any_errors_fatal 16380 1727204155.96993: checking for max_fail_percentage 16380 1727204155.96994: done checking for max_fail_percentage 16380 1727204155.96995: checking to see if all hosts have failed and the running result is not ok 16380 1727204155.96996: done checking to see if all hosts have failed 16380 1727204155.96997: getting the remaining hosts for this loop 16380 1727204155.96999: done getting the remaining hosts for this loop 16380 1727204155.97003: getting the next task for host managed-node2 16380 1727204155.97012: done getting next task for host managed-node2 16380 1727204155.97016: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16380 1727204155.97019: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204155.97039: getting variables 16380 1727204155.97041: in VariableManager get_vars() 16380 1727204155.97077: Calling all_inventory to load vars for managed-node2 16380 1727204155.97081: Calling groups_inventory to load vars for managed-node2 16380 1727204155.97083: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204155.97098: Calling all_plugins_play to load vars for managed-node2 16380 1727204155.97101: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204155.97105: Calling groups_plugins_play to load vars for managed-node2 16380 1727204155.98479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204156.00051: done with get_vars() 16380 1727204156.00075: done getting variables 16380 1727204156.00129: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:56 -0400 (0:00:01.203) 0:00:17.108 ***** 16380 1727204156.00153: entering _queue_task() for managed-node2/service 16380 1727204156.00400: worker is 1 (out of 1 available) 16380 1727204156.00413: exiting _queue_task() for managed-node2/service 16380 1727204156.00427: done queuing things up, now waiting for results queue to drain 16380 1727204156.00429: waiting for pending results... 16380 1727204156.00624: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16380 1727204156.00706: in run() - task 12b410aa-8751-749c-b6eb-000000000021 16380 1727204156.00720: variable 'ansible_search_path' from source: unknown 16380 1727204156.00724: variable 'ansible_search_path' from source: unknown 16380 1727204156.00769: calling self._execute() 16380 1727204156.00849: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204156.00856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204156.00866: variable 'omit' from source: magic vars 16380 1727204156.01210: variable 'ansible_distribution_major_version' from source: facts 16380 1727204156.01223: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204156.01324: variable 'network_provider' from source: set_fact 16380 1727204156.01327: Evaluated conditional (network_provider == "nm"): True 16380 1727204156.01406: variable '__network_wpa_supplicant_required' from source: role '' defaults 16380 1727204156.01484: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16380 1727204156.01640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204156.03325: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204156.03380: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204156.03418: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204156.03449: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204156.03471: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204156.03553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204156.03578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204156.03601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204156.03641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204156.03654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204156.03695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204156.03719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204156.03744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204156.03775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204156.03787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204156.03826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204156.03851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204156.03871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204156.03902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204156.03917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 16380 1727204156.04039: variable 'network_connections' from source: play vars 16380 1727204156.04054: variable 'interface' from source: set_fact 16380 1727204156.04116: variable 'interface' from source: set_fact 16380 1727204156.04124: variable 'interface' from source: set_fact 16380 1727204156.04176: variable 'interface' from source: set_fact 16380 1727204156.04241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204156.04378: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204156.04417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204156.04444: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204156.04469: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204156.04512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204156.04534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204156.04554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204156.04575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204156.04623: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204156.04977: variable 'network_connections' from source: play vars 16380 1727204156.04980: variable 'interface' from source: set_fact 16380 1727204156.05037: variable 'interface' from source: set_fact 16380 1727204156.05041: variable 'interface' from source: set_fact 16380 1727204156.05096: variable 'interface' from source: set_fact 16380 1727204156.05132: Evaluated conditional (__network_wpa_supplicant_required): False 16380 1727204156.05136: when evaluation is False, skipping this task 16380 1727204156.05139: _execute() done 16380 1727204156.05153: dumping result to json 16380 1727204156.05156: done dumping result, returning 16380 1727204156.05159: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-749c-b6eb-000000000021] 16380 1727204156.05163: sending task result for task 12b410aa-8751-749c-b6eb-000000000021 16380 1727204156.05256: done sending task result for task 12b410aa-8751-749c-b6eb-000000000021 16380 1727204156.05259: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 16380 1727204156.05324: no more pending results, returning what we have 16380 1727204156.05327: results queue empty 16380 1727204156.05328: checking for any_errors_fatal 16380 1727204156.05358: done checking for any_errors_fatal 16380 1727204156.05359: checking for max_fail_percentage 16380 1727204156.05361: done checking for max_fail_percentage 16380 
1727204156.05362: checking to see if all hosts have failed and the running result is not ok 16380 1727204156.05363: done checking to see if all hosts have failed 16380 1727204156.05364: getting the remaining hosts for this loop 16380 1727204156.05366: done getting the remaining hosts for this loop 16380 1727204156.05370: getting the next task for host managed-node2 16380 1727204156.05377: done getting next task for host managed-node2 16380 1727204156.05381: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 16380 1727204156.05383: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204156.05399: getting variables 16380 1727204156.05401: in VariableManager get_vars() 16380 1727204156.05438: Calling all_inventory to load vars for managed-node2 16380 1727204156.05442: Calling groups_inventory to load vars for managed-node2 16380 1727204156.05444: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204156.05454: Calling all_plugins_play to load vars for managed-node2 16380 1727204156.05457: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204156.05460: Calling groups_plugins_play to load vars for managed-node2 16380 1727204156.06785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204156.08366: done with get_vars() 16380 1727204156.08387: done getting variables 16380 1727204156.08444: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.083) 0:00:17.191 ***** 16380 1727204156.08467: entering _queue_task() for managed-node2/service 16380 1727204156.08696: worker is 1 (out of 1 available) 16380 1727204156.08713: exiting _queue_task() for managed-node2/service 16380 1727204156.08725: done queuing things up, now waiting for results queue to drain 16380 1727204156.08728: waiting for pending results... 
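
The two service tasks in this stretch of the run are gated on facts the role computed earlier: "Enable and start wpa_supplicant" runs only when __network_wpa_supplicant_required is true (802.1x or wireless profiles defined), which evaluated False above, and the "Enable network service" task just queued applies only to the initscripts provider, so with network_provider == "nm" it is skipped in the records that follow. For orientation only, when those conditions do hold the tasks reduce to roughly the following on the managed node (a sketch; the role actually goes through the Ansible service/systemd modules, exactly as it did for NetworkManager.service earlier in this log):

  # enable and start the supplicant required by 802.1x / wireless profiles
  systemctl enable --now wpa_supplicant.service
  # legacy initscripts provider only: ensure the network service is enabled
  systemctl enable network.service
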
16380 1727204156.08908: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 16380 1727204156.08982: in run() - task 12b410aa-8751-749c-b6eb-000000000022 16380 1727204156.08997: variable 'ansible_search_path' from source: unknown 16380 1727204156.09001: variable 'ansible_search_path' from source: unknown 16380 1727204156.09034: calling self._execute() 16380 1727204156.09116: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204156.09122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204156.09132: variable 'omit' from source: magic vars 16380 1727204156.09455: variable 'ansible_distribution_major_version' from source: facts 16380 1727204156.09465: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204156.09568: variable 'network_provider' from source: set_fact 16380 1727204156.09572: Evaluated conditional (network_provider == "initscripts"): False 16380 1727204156.09575: when evaluation is False, skipping this task 16380 1727204156.09580: _execute() done 16380 1727204156.09583: dumping result to json 16380 1727204156.09590: done dumping result, returning 16380 1727204156.09598: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-749c-b6eb-000000000022] 16380 1727204156.09604: sending task result for task 12b410aa-8751-749c-b6eb-000000000022 16380 1727204156.09696: done sending task result for task 12b410aa-8751-749c-b6eb-000000000022 16380 1727204156.09699: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204156.09764: no more pending results, returning what we have 16380 1727204156.09768: results queue empty 16380 1727204156.09769: checking for any_errors_fatal 16380 1727204156.09776: done checking for any_errors_fatal 16380 1727204156.09777: checking for max_fail_percentage 16380 1727204156.09780: done checking for max_fail_percentage 16380 1727204156.09780: checking to see if all hosts have failed and the running result is not ok 16380 1727204156.09781: done checking to see if all hosts have failed 16380 1727204156.09782: getting the remaining hosts for this loop 16380 1727204156.09784: done getting the remaining hosts for this loop 16380 1727204156.09787: getting the next task for host managed-node2 16380 1727204156.09795: done getting next task for host managed-node2 16380 1727204156.09798: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16380 1727204156.09801: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204156.09818: getting variables 16380 1727204156.09819: in VariableManager get_vars() 16380 1727204156.09853: Calling all_inventory to load vars for managed-node2 16380 1727204156.09856: Calling groups_inventory to load vars for managed-node2 16380 1727204156.09859: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204156.09868: Calling all_plugins_play to load vars for managed-node2 16380 1727204156.09870: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204156.09872: Calling groups_plugins_play to load vars for managed-node2 16380 1727204156.11061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204156.12652: done with get_vars() 16380 1727204156.12674: done getting variables 16380 1727204156.12729: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.042) 0:00:17.234 ***** 16380 1727204156.12754: entering _queue_task() for managed-node2/copy 16380 1727204156.12998: worker is 1 (out of 1 available) 16380 1727204156.13013: exiting _queue_task() for managed-node2/copy 16380 1727204156.13026: done queuing things up, now waiting for results queue to drain 16380 1727204156.13028: waiting for pending results... 16380 1727204156.13207: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16380 1727204156.13288: in run() - task 12b410aa-8751-749c-b6eb-000000000023 16380 1727204156.13305: variable 'ansible_search_path' from source: unknown 16380 1727204156.13308: variable 'ansible_search_path' from source: unknown 16380 1727204156.13340: calling self._execute() 16380 1727204156.13422: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204156.13428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204156.13438: variable 'omit' from source: magic vars 16380 1727204156.13765: variable 'ansible_distribution_major_version' from source: facts 16380 1727204156.13776: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204156.13879: variable 'network_provider' from source: set_fact 16380 1727204156.13883: Evaluated conditional (network_provider == "initscripts"): False 16380 1727204156.13886: when evaluation is False, skipping this task 16380 1727204156.13893: _execute() done 16380 1727204156.13896: dumping result to json 16380 1727204156.13905: done dumping result, returning 16380 1727204156.13921: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-749c-b6eb-000000000023] 16380 1727204156.13926: sending task result for task 12b410aa-8751-749c-b6eb-000000000023 16380 1727204156.14022: done sending task result for task 12b410aa-8751-749c-b6eb-000000000023 16380 1727204156.14025: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 16380 1727204156.14076: no more pending results, returning what we have 16380 1727204156.14079: results queue empty 16380 1727204156.14081: checking for any_errors_fatal 16380 1727204156.14086: done checking for any_errors_fatal 16380 1727204156.14087: checking for max_fail_percentage 16380 1727204156.14090: done checking for max_fail_percentage 16380 1727204156.14092: checking to see if all hosts have failed and the running result is not ok 16380 1727204156.14092: done checking to see if all hosts have failed 16380 1727204156.14093: getting the remaining hosts for this loop 16380 1727204156.14096: done getting the remaining hosts for this loop 16380 1727204156.14100: getting the next task for host managed-node2 16380 1727204156.14107: done getting next task for host managed-node2 16380 1727204156.14113: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16380 1727204156.14115: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204156.14129: getting variables 16380 1727204156.14130: in VariableManager get_vars() 16380 1727204156.14164: Calling all_inventory to load vars for managed-node2 16380 1727204156.14167: Calling groups_inventory to load vars for managed-node2 16380 1727204156.14170: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204156.14180: Calling all_plugins_play to load vars for managed-node2 16380 1727204156.14183: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204156.14186: Calling groups_plugins_play to load vars for managed-node2 16380 1727204156.15499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204156.17075: done with get_vars() 16380 1727204156.17105: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:56 -0400 (0:00:00.044) 0:00:17.278 ***** 16380 1727204156.17182: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16380 1727204156.17184: Creating lock for fedora.linux_system_roles.network_connections 16380 1727204156.17465: worker is 1 (out of 1 available) 16380 1727204156.17482: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16380 1727204156.17497: done queuing things up, now waiting for results queue to drain 16380 1727204156.17499: waiting for pending results... 
16380 1727204156.17686: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16380 1727204156.17769: in run() - task 12b410aa-8751-749c-b6eb-000000000024 16380 1727204156.17782: variable 'ansible_search_path' from source: unknown 16380 1727204156.17785: variable 'ansible_search_path' from source: unknown 16380 1727204156.17823: calling self._execute() 16380 1727204156.17910: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204156.17918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204156.17929: variable 'omit' from source: magic vars 16380 1727204156.18268: variable 'ansible_distribution_major_version' from source: facts 16380 1727204156.18285: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204156.18290: variable 'omit' from source: magic vars 16380 1727204156.18327: variable 'omit' from source: magic vars 16380 1727204156.18468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204156.21011: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204156.21061: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204156.21105: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204156.21152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204156.21184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204156.21287: variable 'network_provider' from source: set_fact 16380 1727204156.21459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204156.21513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204156.21555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204156.21610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204156.21626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204156.21719: variable 'omit' from source: magic vars 16380 1727204156.21863: variable 'omit' from source: magic vars 16380 1727204156.21997: variable 'network_connections' from source: play vars 16380 1727204156.22011: variable 'interface' from source: set_fact 16380 1727204156.22099: variable 'interface' from source: set_fact 16380 1727204156.22104: variable 'interface' from source: set_fact 16380 1727204156.22180: variable 'interface' from source: set_fact 16380 1727204156.22376: variable 'omit' from source: magic vars 16380 1727204156.22386: 
variable '__lsr_ansible_managed' from source: task vars 16380 1727204156.22461: variable '__lsr_ansible_managed' from source: task vars 16380 1727204156.22836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 16380 1727204156.23157: Loaded config def from plugin (lookup/template) 16380 1727204156.23163: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 16380 1727204156.23196: File lookup term: get_ansible_managed.j2 16380 1727204156.23199: variable 'ansible_search_path' from source: unknown 16380 1727204156.23208: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 16380 1727204156.23231: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 16380 1727204156.23250: variable 'ansible_search_path' from source: unknown 16380 1727204156.29058: variable 'ansible_managed' from source: unknown 16380 1727204156.29188: variable 'omit' from source: magic vars 16380 1727204156.29217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204156.29241: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204156.29258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204156.29274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204156.29289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204156.29316: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204156.29321: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204156.29324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204156.29407: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204156.29411: Set connection var ansible_shell_executable to /bin/sh 16380 1727204156.29420: Set connection var ansible_connection to ssh 16380 1727204156.29426: Set connection var ansible_shell_type to sh 16380 1727204156.29434: Set connection var ansible_pipelining to False 16380 1727204156.29443: Set connection var ansible_timeout to 10 16380 1727204156.29463: variable 'ansible_shell_executable' from source: unknown 16380 
1727204156.29466: variable 'ansible_connection' from source: unknown 16380 1727204156.29470: variable 'ansible_module_compression' from source: unknown 16380 1727204156.29474: variable 'ansible_shell_type' from source: unknown 16380 1727204156.29476: variable 'ansible_shell_executable' from source: unknown 16380 1727204156.29481: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204156.29486: variable 'ansible_pipelining' from source: unknown 16380 1727204156.29491: variable 'ansible_timeout' from source: unknown 16380 1727204156.29496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204156.29621: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204156.29633: variable 'omit' from source: magic vars 16380 1727204156.29636: starting attempt loop 16380 1727204156.29639: running the handler 16380 1727204156.29657: _low_level_execute_command(): starting 16380 1727204156.29661: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204156.30219: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204156.30224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.30236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.30305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204156.30309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204156.30312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204156.30369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204156.32178: stdout chunk (state=3): >>>/root <<< 16380 1727204156.32288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204156.32352: stderr chunk (state=3): >>><<< 16380 1727204156.32355: stdout chunk (state=3): >>><<< 16380 1727204156.32376: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204156.32388: _low_level_execute_command(): starting 16380 1727204156.32397: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320 `" && echo ansible-tmp-1727204156.3237684-17812-202937526219320="` echo /root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320 `" ) && sleep 0' 16380 1727204156.32861: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204156.32912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204156.32916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.32919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204156.32921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204156.32924: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.32968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204156.32974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204156.32976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204156.33020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204156.35111: stdout chunk (state=3): >>>ansible-tmp-1727204156.3237684-17812-202937526219320=/root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320 <<< 16380 1727204156.35229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204156.35285: stderr chunk (state=3): >>><<< 16380 1727204156.35292: stdout chunk (state=3): >>><<< 16380 1727204156.35308: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204156.3237684-17812-202937526219320=/root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204156.35351: variable 'ansible_module_compression' from source: unknown 16380 1727204156.35398: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 16380 1727204156.35403: ANSIBALLZ: Acquiring lock 16380 1727204156.35406: ANSIBALLZ: Lock acquired: 140602933511232 16380 1727204156.35408: ANSIBALLZ: Creating module 16380 1727204156.53084: ANSIBALLZ: Writing module into payload 16380 1727204156.53428: ANSIBALLZ: Writing module 16380 1727204156.53452: ANSIBALLZ: Renaming module 16380 1727204156.53458: ANSIBALLZ: Done creating module 16380 1727204156.53487: variable 'ansible_facts' from source: unknown 16380 1727204156.53558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/AnsiballZ_network_connections.py 16380 1727204156.53684: Sending initial data 16380 1727204156.53688: Sent initial data (168 bytes) 16380 1727204156.54204: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204156.54211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204156.54215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204156.54217: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204156.54220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.54266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204156.54281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204156.54339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204156.56098: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204156.56131: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204156.56173: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp06ablk75 /root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/AnsiballZ_network_connections.py <<< 16380 1727204156.56181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/AnsiballZ_network_connections.py" <<< 16380 1727204156.56212: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp06ablk75" to remote "/root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/AnsiballZ_network_connections.py" <<< 16380 1727204156.57342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204156.57417: stderr chunk (state=3): >>><<< 16380 1727204156.57420: stdout chunk (state=3): >>><<< 16380 1727204156.57442: done transferring module to remote 16380 1727204156.57453: _low_level_execute_command(): starting 16380 1727204156.57458: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/ /root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/AnsiballZ_network_connections.py && sleep 0' 16380 1727204156.57950: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204156.57953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.57956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204156.57958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.58011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204156.58017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 16380 1727204156.58058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204156.59998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204156.60030: stderr chunk (state=3): >>><<< 16380 1727204156.60033: stdout chunk (state=3): >>><<< 16380 1727204156.60049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204156.60052: _low_level_execute_command(): starting 16380 1727204156.60058: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/AnsiballZ_network_connections.py && sleep 0' 16380 1727204156.60494: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204156.60539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204156.60543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.60546: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204156.60548: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204156.60552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204156.60557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.60602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204156.60606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204156.60652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204156.95595: stdout chunk (state=3): >>> {"changed": true, "warnings": 
[], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 16380 1727204156.97841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 16380 1727204156.97903: stderr chunk (state=3): >>><<< 16380 1727204156.97907: stdout chunk (state=3): >>><<< 16380 1727204156.97926: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
16380 1727204156.97970: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204156.97979: _low_level_execute_command(): starting 16380 1727204156.97985: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204156.3237684-17812-202937526219320/ > /dev/null 2>&1 && sleep 0' 16380 1727204156.98461: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204156.98504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204156.98508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.98513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204156.98515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204156.98518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204156.98556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204156.98574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204156.98619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204157.00691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204157.00743: stderr chunk (state=3): >>><<< 16380 1727204157.00746: stdout chunk (state=3): >>><<< 16380 1727204157.00764: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204157.00772: handler run complete 16380 1727204157.00806: attempt loop complete, returning result 16380 1727204157.00812: _execute() done 16380 1727204157.00815: dumping result to json 16380 1727204157.00822: done dumping result, returning 16380 1727204157.00832: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-749c-b6eb-000000000024] 16380 1727204157.00837: sending task result for task 12b410aa-8751-749c-b6eb-000000000024 16380 1727204157.00954: done sending task result for task 12b410aa-8751-749c-b6eb-000000000024 16380 1727204157.00957: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced [004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced (not-active) 16380 1727204157.01103: no more pending results, returning what we have 16380 1727204157.01106: results queue empty 16380 1727204157.01107: checking for any_errors_fatal 16380 1727204157.01117: done checking for any_errors_fatal 16380 1727204157.01118: checking for max_fail_percentage 16380 1727204157.01120: done checking for max_fail_percentage 16380 1727204157.01121: checking to see if all hosts have failed and the running result is not ok 16380 1727204157.01122: done checking to see if all hosts have failed 16380 1727204157.01123: getting the remaining hosts for this loop 16380 1727204157.01124: done getting the remaining hosts for this loop 16380 1727204157.01128: getting the next task for host managed-node2 16380 1727204157.01134: done getting next task for host managed-node2 16380 1727204157.01138: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 16380 1727204157.01140: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204157.01151: getting variables 16380 1727204157.01153: in VariableManager get_vars() 16380 1727204157.01202: Calling all_inventory to load vars for managed-node2 16380 1727204157.01205: Calling groups_inventory to load vars for managed-node2 16380 1727204157.01208: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204157.01222: Calling all_plugins_play to load vars for managed-node2 16380 1727204157.01225: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204157.01228: Calling groups_plugins_play to load vars for managed-node2 16380 1727204157.02615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204157.04182: done with get_vars() 16380 1727204157.04206: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.870) 0:00:18.149 ***** 16380 1727204157.04278: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16380 1727204157.04280: Creating lock for fedora.linux_system_roles.network_state 16380 1727204157.04522: worker is 1 (out of 1 available) 16380 1727204157.04537: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16380 1727204157.04551: done queuing things up, now waiting for results queue to drain 16380 1727204157.04553: waiting for pending results... 16380 1727204157.04743: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 16380 1727204157.04832: in run() - task 12b410aa-8751-749c-b6eb-000000000025 16380 1727204157.04845: variable 'ansible_search_path' from source: unknown 16380 1727204157.04849: variable 'ansible_search_path' from source: unknown 16380 1727204157.04890: calling self._execute() 16380 1727204157.04968: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.04975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.04985: variable 'omit' from source: magic vars 16380 1727204157.05327: variable 'ansible_distribution_major_version' from source: facts 16380 1727204157.05341: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204157.05458: variable 'network_state' from source: role '' defaults 16380 1727204157.05467: Evaluated conditional (network_state != {}): False 16380 1727204157.05470: when evaluation is False, skipping this task 16380 1727204157.05473: _execute() done 16380 1727204157.05478: dumping result to json 16380 1727204157.05482: done dumping result, returning 16380 1727204157.05492: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-749c-b6eb-000000000025] 16380 1727204157.05497: sending task result for task 12b410aa-8751-749c-b6eb-000000000025 16380 1727204157.05595: done sending task result for task 12b410aa-8751-749c-b6eb-000000000025 16380 1727204157.05598: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204157.05650: no more pending results, returning what we have 16380 1727204157.05653: results queue empty 16380 1727204157.05655: checking for any_errors_fatal 16380 1727204157.05664: done checking for 
any_errors_fatal 16380 1727204157.05665: checking for max_fail_percentage 16380 1727204157.05667: done checking for max_fail_percentage 16380 1727204157.05667: checking to see if all hosts have failed and the running result is not ok 16380 1727204157.05668: done checking to see if all hosts have failed 16380 1727204157.05670: getting the remaining hosts for this loop 16380 1727204157.05671: done getting the remaining hosts for this loop 16380 1727204157.05675: getting the next task for host managed-node2 16380 1727204157.05681: done getting next task for host managed-node2 16380 1727204157.05685: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16380 1727204157.05687: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204157.05703: getting variables 16380 1727204157.05705: in VariableManager get_vars() 16380 1727204157.05739: Calling all_inventory to load vars for managed-node2 16380 1727204157.05742: Calling groups_inventory to load vars for managed-node2 16380 1727204157.05744: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204157.05754: Calling all_plugins_play to load vars for managed-node2 16380 1727204157.05757: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204157.05761: Calling groups_plugins_play to load vars for managed-node2 16380 1727204157.06960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204157.08546: done with get_vars() 16380 1727204157.08568: done getting variables 16380 1727204157.08624: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.043) 0:00:18.193 ***** 16380 1727204157.08648: entering _queue_task() for managed-node2/debug 16380 1727204157.08891: worker is 1 (out of 1 available) 16380 1727204157.08906: exiting _queue_task() for managed-node2/debug 16380 1727204157.08919: done queuing things up, now waiting for results queue to drain 16380 1727204157.08921: waiting for pending results... 
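The task queued here is a pure reporting step: it echoes the registered result of the connections run and executes no remote command, which is why no _low_level_execute_command() lines follow it. A minimal equivalent, assuming the variable name shown in this log:

    # Sketch of the debug task at roles/network/tasks/main.yml:177.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
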
16380 1727204157.09105: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16380 1727204157.09186: in run() - task 12b410aa-8751-749c-b6eb-000000000026 16380 1727204157.09202: variable 'ansible_search_path' from source: unknown 16380 1727204157.09206: variable 'ansible_search_path' from source: unknown 16380 1727204157.09238: calling self._execute() 16380 1727204157.09321: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.09326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.09338: variable 'omit' from source: magic vars 16380 1727204157.09664: variable 'ansible_distribution_major_version' from source: facts 16380 1727204157.09674: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204157.09681: variable 'omit' from source: magic vars 16380 1727204157.09723: variable 'omit' from source: magic vars 16380 1727204157.09753: variable 'omit' from source: magic vars 16380 1727204157.09792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204157.09828: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204157.09845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204157.09862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204157.09873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204157.09902: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204157.09905: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.09916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.09998: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204157.10005: Set connection var ansible_shell_executable to /bin/sh 16380 1727204157.10014: Set connection var ansible_connection to ssh 16380 1727204157.10020: Set connection var ansible_shell_type to sh 16380 1727204157.10026: Set connection var ansible_pipelining to False 16380 1727204157.10038: Set connection var ansible_timeout to 10 16380 1727204157.10057: variable 'ansible_shell_executable' from source: unknown 16380 1727204157.10060: variable 'ansible_connection' from source: unknown 16380 1727204157.10064: variable 'ansible_module_compression' from source: unknown 16380 1727204157.10066: variable 'ansible_shell_type' from source: unknown 16380 1727204157.10071: variable 'ansible_shell_executable' from source: unknown 16380 1727204157.10073: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.10079: variable 'ansible_pipelining' from source: unknown 16380 1727204157.10082: variable 'ansible_timeout' from source: unknown 16380 1727204157.10087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.10211: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 
1727204157.10220: variable 'omit' from source: magic vars 16380 1727204157.10226: starting attempt loop 16380 1727204157.10229: running the handler 16380 1727204157.10342: variable '__network_connections_result' from source: set_fact 16380 1727204157.10393: handler run complete 16380 1727204157.10414: attempt loop complete, returning result 16380 1727204157.10417: _execute() done 16380 1727204157.10420: dumping result to json 16380 1727204157.10423: done dumping result, returning 16380 1727204157.10431: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-749c-b6eb-000000000026] 16380 1727204157.10436: sending task result for task 12b410aa-8751-749c-b6eb-000000000026 16380 1727204157.10528: done sending task result for task 12b410aa-8751-749c-b6eb-000000000026 16380 1727204157.10531: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced (not-active)" ] } 16380 1727204157.10632: no more pending results, returning what we have 16380 1727204157.10635: results queue empty 16380 1727204157.10637: checking for any_errors_fatal 16380 1727204157.10643: done checking for any_errors_fatal 16380 1727204157.10644: checking for max_fail_percentage 16380 1727204157.10646: done checking for max_fail_percentage 16380 1727204157.10647: checking to see if all hosts have failed and the running result is not ok 16380 1727204157.10648: done checking to see if all hosts have failed 16380 1727204157.10649: getting the remaining hosts for this loop 16380 1727204157.10650: done getting the remaining hosts for this loop 16380 1727204157.10654: getting the next task for host managed-node2 16380 1727204157.10659: done getting next task for host managed-node2 16380 1727204157.10663: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16380 1727204157.10665: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204157.10675: getting variables 16380 1727204157.10677: in VariableManager get_vars() 16380 1727204157.10715: Calling all_inventory to load vars for managed-node2 16380 1727204157.10718: Calling groups_inventory to load vars for managed-node2 16380 1727204157.10726: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204157.10734: Calling all_plugins_play to load vars for managed-node2 16380 1727204157.10737: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204157.10739: Calling groups_plugins_play to load vars for managed-node2 16380 1727204157.12005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204157.13565: done with get_vars() 16380 1727204157.13587: done getting variables 16380 1727204157.13637: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.050) 0:00:18.243 ***** 16380 1727204157.13664: entering _queue_task() for managed-node2/debug 16380 1727204157.13908: worker is 1 (out of 1 available) 16380 1727204157.13924: exiting _queue_task() for managed-node2/debug 16380 1727204157.13936: done queuing things up, now waiting for results queue to drain 16380 1727204157.13939: waiting for pending results... 16380 1727204157.14140: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16380 1727204157.14222: in run() - task 12b410aa-8751-749c-b6eb-000000000027 16380 1727204157.14235: variable 'ansible_search_path' from source: unknown 16380 1727204157.14240: variable 'ansible_search_path' from source: unknown 16380 1727204157.14272: calling self._execute() 16380 1727204157.14355: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.14360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.14370: variable 'omit' from source: magic vars 16380 1727204157.14692: variable 'ansible_distribution_major_version' from source: facts 16380 1727204157.14702: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204157.14713: variable 'omit' from source: magic vars 16380 1727204157.14745: variable 'omit' from source: magic vars 16380 1727204157.14780: variable 'omit' from source: magic vars 16380 1727204157.14818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204157.14850: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204157.14867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204157.14887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204157.14898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204157.14927: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204157.14932: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.14934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.15020: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204157.15027: Set connection var ansible_shell_executable to /bin/sh 16380 1727204157.15034: Set connection var ansible_connection to ssh 16380 1727204157.15040: Set connection var ansible_shell_type to sh 16380 1727204157.15047: Set connection var ansible_pipelining to False 16380 1727204157.15056: Set connection var ansible_timeout to 10 16380 1727204157.15074: variable 'ansible_shell_executable' from source: unknown 16380 1727204157.15077: variable 'ansible_connection' from source: unknown 16380 1727204157.15081: variable 'ansible_module_compression' from source: unknown 16380 1727204157.15086: variable 'ansible_shell_type' from source: unknown 16380 1727204157.15090: variable 'ansible_shell_executable' from source: unknown 16380 1727204157.15095: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.15101: variable 'ansible_pipelining' from source: unknown 16380 1727204157.15104: variable 'ansible_timeout' from source: unknown 16380 1727204157.15118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.15233: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204157.15244: variable 'omit' from source: magic vars 16380 1727204157.15250: starting attempt loop 16380 1727204157.15253: running the handler 16380 1727204157.15299: variable '__network_connections_result' from source: set_fact 16380 1727204157.15371: variable '__network_connections_result' from source: set_fact 16380 1727204157.15481: handler run complete 16380 1727204157.15507: attempt loop complete, returning result 16380 1727204157.15513: _execute() done 16380 1727204157.15516: dumping result to json 16380 1727204157.15519: done dumping result, returning 16380 1727204157.15528: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-749c-b6eb-000000000027] 16380 1727204157.15533: sending task result for task 12b410aa-8751-749c-b6eb-000000000027 16380 1727204157.15635: done sending task result for task 12b410aa-8751-749c-b6eb-000000000027 16380 1727204157.15637: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 
'LSR-TST-br31': add connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 528bfc52-b65b-476e-8410-3e9f9aa7eced (not-active)" ] } } 16380 1727204157.15740: no more pending results, returning what we have 16380 1727204157.15743: results queue empty 16380 1727204157.15745: checking for any_errors_fatal 16380 1727204157.15749: done checking for any_errors_fatal 16380 1727204157.15750: checking for max_fail_percentage 16380 1727204157.15752: done checking for max_fail_percentage 16380 1727204157.15752: checking to see if all hosts have failed and the running result is not ok 16380 1727204157.15753: done checking to see if all hosts have failed 16380 1727204157.15754: getting the remaining hosts for this loop 16380 1727204157.15756: done getting the remaining hosts for this loop 16380 1727204157.15759: getting the next task for host managed-node2 16380 1727204157.15765: done getting next task for host managed-node2 16380 1727204157.15769: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16380 1727204157.15771: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204157.15781: getting variables 16380 1727204157.15782: in VariableManager get_vars() 16380 1727204157.15828: Calling all_inventory to load vars for managed-node2 16380 1727204157.15831: Calling groups_inventory to load vars for managed-node2 16380 1727204157.15834: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204157.15844: Calling all_plugins_play to load vars for managed-node2 16380 1727204157.15846: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204157.15848: Calling groups_plugins_play to load vars for managed-node2 16380 1727204157.17126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204157.18694: done with get_vars() 16380 1727204157.18715: done getting variables 16380 1727204157.18765: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.051) 0:00:18.294 ***** 16380 1727204157.18788: entering _queue_task() for managed-node2/debug 16380 1727204157.19010: worker is 1 (out of 1 available) 16380 1727204157.19025: exiting _queue_task() for managed-node2/debug 16380 1727204157.19037: done queuing things up, now waiting for results queue to drain 16380 1727204157.19040: waiting for pending results... 
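The "Configure networking state" task above already skipped on the conditional network_state != {}, and the debug task queued here is about to skip on the same test, because network_state is left at its role default of an empty dict in this run. A hedged sketch of the variable involved (the non-empty form is an assumption about the role's nmstate-style input; its exact schema is not shown anywhere in this log):

    network_state: {}          # role default here; both tasks skip on it
    # To exercise the state path instead, something like:
    # network_state:
    #   interfaces:
    #     - name: LSR-TST-br31
    #       type: linux-bridge
    #       state: up
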
16380 1727204157.19229: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16380 1727204157.19308: in run() - task 12b410aa-8751-749c-b6eb-000000000028 16380 1727204157.19324: variable 'ansible_search_path' from source: unknown 16380 1727204157.19328: variable 'ansible_search_path' from source: unknown 16380 1727204157.19357: calling self._execute() 16380 1727204157.19438: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.19444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.19454: variable 'omit' from source: magic vars 16380 1727204157.19788: variable 'ansible_distribution_major_version' from source: facts 16380 1727204157.19800: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204157.19913: variable 'network_state' from source: role '' defaults 16380 1727204157.19926: Evaluated conditional (network_state != {}): False 16380 1727204157.19934: when evaluation is False, skipping this task 16380 1727204157.19938: _execute() done 16380 1727204157.19941: dumping result to json 16380 1727204157.19944: done dumping result, returning 16380 1727204157.19947: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-749c-b6eb-000000000028] 16380 1727204157.19956: sending task result for task 12b410aa-8751-749c-b6eb-000000000028 16380 1727204157.20047: done sending task result for task 12b410aa-8751-749c-b6eb-000000000028 16380 1727204157.20051: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 16380 1727204157.20112: no more pending results, returning what we have 16380 1727204157.20116: results queue empty 16380 1727204157.20117: checking for any_errors_fatal 16380 1727204157.20124: done checking for any_errors_fatal 16380 1727204157.20125: checking for max_fail_percentage 16380 1727204157.20126: done checking for max_fail_percentage 16380 1727204157.20127: checking to see if all hosts have failed and the running result is not ok 16380 1727204157.20128: done checking to see if all hosts have failed 16380 1727204157.20129: getting the remaining hosts for this loop 16380 1727204157.20130: done getting the remaining hosts for this loop 16380 1727204157.20133: getting the next task for host managed-node2 16380 1727204157.20139: done getting next task for host managed-node2 16380 1727204157.20142: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 16380 1727204157.20145: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204157.20158: getting variables 16380 1727204157.20167: in VariableManager get_vars() 16380 1727204157.20205: Calling all_inventory to load vars for managed-node2 16380 1727204157.20208: Calling groups_inventory to load vars for managed-node2 16380 1727204157.20214: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204157.20223: Calling all_plugins_play to load vars for managed-node2 16380 1727204157.20226: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204157.20230: Calling groups_plugins_play to load vars for managed-node2 16380 1727204157.21430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204157.23028: done with get_vars() 16380 1727204157.23058: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:57 -0400 (0:00:00.043) 0:00:18.338 ***** 16380 1727204157.23152: entering _queue_task() for managed-node2/ping 16380 1727204157.23154: Creating lock for ping 16380 1727204157.23449: worker is 1 (out of 1 available) 16380 1727204157.23464: exiting _queue_task() for managed-node2/ping 16380 1727204157.23478: done queuing things up, now waiting for results queue to drain 16380 1727204157.23480: waiting for pending results... 16380 1727204157.23678: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 16380 1727204157.23759: in run() - task 12b410aa-8751-749c-b6eb-000000000029 16380 1727204157.23773: variable 'ansible_search_path' from source: unknown 16380 1727204157.23776: variable 'ansible_search_path' from source: unknown 16380 1727204157.23816: calling self._execute() 16380 1727204157.23895: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.23902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.23915: variable 'omit' from source: magic vars 16380 1727204157.24256: variable 'ansible_distribution_major_version' from source: facts 16380 1727204157.24268: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204157.24272: variable 'omit' from source: magic vars 16380 1727204157.24315: variable 'omit' from source: magic vars 16380 1727204157.24346: variable 'omit' from source: magic vars 16380 1727204157.24403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204157.24498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204157.24502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204157.24506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204157.24511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204157.24699: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204157.24702: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.24705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.24832: Set connection var ansible_module_compression to 
ZIP_DEFLATED 16380 1727204157.24838: Set connection var ansible_shell_executable to /bin/sh 16380 1727204157.24846: Set connection var ansible_connection to ssh 16380 1727204157.24855: Set connection var ansible_shell_type to sh 16380 1727204157.24863: Set connection var ansible_pipelining to False 16380 1727204157.24874: Set connection var ansible_timeout to 10 16380 1727204157.24906: variable 'ansible_shell_executable' from source: unknown 16380 1727204157.24916: variable 'ansible_connection' from source: unknown 16380 1727204157.24920: variable 'ansible_module_compression' from source: unknown 16380 1727204157.24923: variable 'ansible_shell_type' from source: unknown 16380 1727204157.24944: variable 'ansible_shell_executable' from source: unknown 16380 1727204157.24947: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204157.24949: variable 'ansible_pipelining' from source: unknown 16380 1727204157.24952: variable 'ansible_timeout' from source: unknown 16380 1727204157.24954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204157.25248: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204157.25261: variable 'omit' from source: magic vars 16380 1727204157.25266: starting attempt loop 16380 1727204157.25272: running the handler 16380 1727204157.25275: _low_level_execute_command(): starting 16380 1727204157.25277: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204157.25978: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204157.25994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204157.26008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204157.26103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204157.26114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204157.26117: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204157.26122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204157.26125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16380 1727204157.26128: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204157.26130: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16380 1727204157.26137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204157.26140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204157.26143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204157.26151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204157.26155: stderr chunk (state=3): >>>debug2: match found <<< 16380 1727204157.26168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204157.26249: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204157.26299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204157.26369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204157.28171: stdout chunk (state=3): >>>/root <<< 16380 1727204157.28325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204157.28383: stderr chunk (state=3): >>><<< 16380 1727204157.28399: stdout chunk (state=3): >>><<< 16380 1727204157.28453: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204157.28574: _low_level_execute_command(): starting 16380 1727204157.28578: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587 `" && echo ansible-tmp-1727204157.2846105-17831-120891970351587="` echo /root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587 `" ) && sleep 0' 16380 1727204157.29017: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204157.29055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204157.29059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204157.29062: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204157.29065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204157.29125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204157.29132: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204157.29182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204157.31285: stdout chunk (state=3): >>>ansible-tmp-1727204157.2846105-17831-120891970351587=/root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587 <<< 16380 1727204157.31400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204157.31466: stderr chunk (state=3): >>><<< 16380 1727204157.31470: stdout chunk (state=3): >>><<< 16380 1727204157.31487: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204157.2846105-17831-120891970351587=/root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204157.31536: variable 'ansible_module_compression' from source: unknown 16380 1727204157.31585: ANSIBALLZ: Using lock for ping 16380 1727204157.31588: ANSIBALLZ: Acquiring lock 16380 1727204157.31593: ANSIBALLZ: Lock acquired: 140602933406240 16380 1727204157.31597: ANSIBALLZ: Creating module 16380 1727204157.43177: ANSIBALLZ: Writing module into payload 16380 1727204157.43182: ANSIBALLZ: Writing module 16380 1727204157.43196: ANSIBALLZ: Renaming module 16380 1727204157.43206: ANSIBALLZ: Done creating module 16380 1727204157.43223: variable 'ansible_facts' from source: unknown 16380 1727204157.43297: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/AnsiballZ_ping.py 16380 1727204157.43544: Sending initial data 16380 1727204157.43547: Sent initial data (153 bytes) 16380 1727204157.44324: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204157.44441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204157.44458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204157.44541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204157.46293: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204157.46323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204157.46361: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpyl7hbwsc /root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/AnsiballZ_ping.py <<< 16380 1727204157.46367: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/AnsiballZ_ping.py" <<< 16380 1727204157.46398: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpyl7hbwsc" to remote "/root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/AnsiballZ_ping.py" <<< 16380 1727204157.46407: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/AnsiballZ_ping.py" <<< 16380 1727204157.47668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204157.48020: stderr chunk (state=3): >>><<< 16380 1727204157.48030: stdout chunk (state=3): >>><<< 16380 1727204157.48162: done transferring module to remote 16380 1727204157.48165: _low_level_execute_command(): starting 16380 1727204157.48168: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/ /root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/AnsiballZ_ping.py && sleep 0' 16380 1727204157.49283: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204157.49311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204157.49329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204157.49413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204157.51399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204157.51432: stderr chunk (state=3): >>><<< 16380 1727204157.51434: stdout chunk (state=3): >>><<< 16380 1727204157.51446: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204157.51496: _low_level_execute_command(): starting 16380 1727204157.51500: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/AnsiballZ_ping.py && sleep 0' 16380 1727204157.52031: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204157.52035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204157.52037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204157.52040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204157.52044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204157.52114: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204157.52118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204157.52164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204157.69969: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 16380 1727204157.71605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 16380 1727204157.71609: stderr chunk (state=3): >>><<< 16380 1727204157.71611: stdout chunk (state=3): >>><<< 16380 1727204157.71614: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
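The module run above is the whole wire contract: AnsiballZ_ping.py is executed with the remote python and must print exactly one JSON document on stdout ({"ping": "pong", ...}), which the controller then parses into the task result. Below is a minimal stdlib stand-in for that contract; the real payload is a self-extracting zip bundling the stock ping module plus its module_utils, and the argv-based argument handling here is a simplification, not how AnsiballZ actually passes parameters.

```python
#!/usr/bin/env python3
"""Toy stand-in for what AnsiballZ_ping.py ultimately does on the target:
read module parameters and emit one JSON result document on stdout."""
import json
import sys

def main() -> None:
    # Real payloads unpack their args from an embedded JSON blob; reading
    # them from argv[1] here is purely for illustration.
    params = json.loads(sys.argv[1]) if len(sys.argv) > 1 else {}
    data = params.get("data", "pong")  # stock ping echoes 'data', default "pong"
    print(json.dumps({"ping": data,
                      "invocation": {"module_args": {"data": data}}}))

if __name__ == "__main__":
    main()
```

Run with no arguments, this prints the same stdout the log captured above, which the controller reports as ok: [managed-node2] => {"changed": false, "ping": "pong"}.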
16380 1727204157.71617: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204157.71622: _low_level_execute_command(): starting 16380 1727204157.71629: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204157.2846105-17831-120891970351587/ > /dev/null 2>&1 && sleep 0' 16380 1727204157.72242: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204157.72260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204157.72264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204157.72279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204157.72294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204157.72304: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204157.72319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204157.72334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16380 1727204157.72344: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204157.72352: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16380 1727204157.72367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204157.72370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204157.72385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204157.72396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204157.72404: stderr chunk (state=3): >>>debug2: match found <<< 16380 1727204157.72419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204157.72491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204157.72507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204157.72528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204157.72596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204157.74939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204157.74942: stdout chunk (state=3): >>><<< 16380 1727204157.74945: stderr chunk (state=3): >>><<< 16380 1727204157.74955: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204157.74958: handler run complete 16380 1727204157.74960: attempt loop complete, returning result 16380 1727204157.74966: _execute() done 16380 1727204157.74968: dumping result to json 16380 1727204157.74975: done dumping result, returning 16380 1727204157.74994: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-749c-b6eb-000000000029] 16380 1727204157.74997: sending task result for task 12b410aa-8751-749c-b6eb-000000000029 ok: [managed-node2] => { "changed": false, "ping": "pong" } 16380 1727204157.75282: no more pending results, returning what we have 16380 1727204157.75285: results queue empty 16380 1727204157.75287: checking for any_errors_fatal 16380 1727204157.75293: done checking for any_errors_fatal 16380 1727204157.75294: checking for max_fail_percentage 16380 1727204157.75296: done checking for max_fail_percentage 16380 1727204157.75297: checking to see if all hosts have failed and the running result is not ok 16380 1727204157.75298: done checking to see if all hosts have failed 16380 1727204157.75301: getting the remaining hosts for this loop 16380 1727204157.75306: done getting the remaining hosts for this loop 16380 1727204157.75310: getting the next task for host managed-node2 16380 1727204157.75319: done getting next task for host managed-node2 16380 1727204157.75323: ^ task is: TASK: meta (role_complete) 16380 1727204157.75326: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204157.75338: getting variables 16380 1727204157.75340: in VariableManager get_vars() 16380 1727204157.75503: Calling all_inventory to load vars for managed-node2 16380 1727204157.75507: Calling groups_inventory to load vars for managed-node2 16380 1727204157.75510: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204157.75523: Calling all_plugins_play to load vars for managed-node2 16380 1727204157.75527: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204157.75530: Calling groups_plugins_play to load vars for managed-node2 16380 1727204157.76268: done sending task result for task 12b410aa-8751-749c-b6eb-000000000029 16380 1727204157.76272: WORKER PROCESS EXITING 16380 1727204157.80700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204157.88393: done with get_vars() 16380 1727204157.88438: done getting variables 16380 1727204157.88662: done queuing things up, now waiting for results queue to drain 16380 1727204157.88665: results queue empty 16380 1727204157.88666: checking for any_errors_fatal 16380 1727204157.88670: done checking for any_errors_fatal 16380 1727204157.88672: checking for max_fail_percentage 16380 1727204157.88673: done checking for max_fail_percentage 16380 1727204157.88674: checking to see if all hosts have failed and the running result is not ok 16380 1727204157.88675: done checking to see if all hosts have failed 16380 1727204157.88676: getting the remaining hosts for this loop 16380 1727204157.88678: done getting the remaining hosts for this loop 16380 1727204157.88681: getting the next task for host managed-node2 16380 1727204157.88686: done getting next task for host managed-node2 16380 1727204157.88688: ^ task is: TASK: meta (flush_handlers) 16380 1727204157.88794: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204157.88799: getting variables 16380 1727204157.88800: in VariableManager get_vars() 16380 1727204157.88822: Calling all_inventory to load vars for managed-node2 16380 1727204157.88825: Calling groups_inventory to load vars for managed-node2 16380 1727204157.88892: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204157.88901: Calling all_plugins_play to load vars for managed-node2 16380 1727204157.88905: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204157.88909: Calling groups_plugins_play to load vars for managed-node2 16380 1727204157.93664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204157.98388: done with get_vars() 16380 1727204157.98439: done getting variables 16380 1727204157.98639: in VariableManager get_vars() 16380 1727204157.98657: Calling all_inventory to load vars for managed-node2 16380 1727204157.98660: Calling groups_inventory to load vars for managed-node2 16380 1727204157.98663: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204157.98669: Calling all_plugins_play to load vars for managed-node2 16380 1727204157.98672: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204157.98676: Calling groups_plugins_play to load vars for managed-node2 16380 1727204158.01235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204158.07933: done with get_vars() 16380 1727204158.07986: done queuing things up, now waiting for results queue to drain 16380 1727204158.07988: results queue empty 16380 1727204158.08093: checking for any_errors_fatal 16380 1727204158.08095: done checking for any_errors_fatal 16380 1727204158.08096: checking for max_fail_percentage 16380 1727204158.08098: done checking for max_fail_percentage 16380 1727204158.08099: checking to see if all hosts have failed and the running result is not ok 16380 1727204158.08100: done checking to see if all hosts have failed 16380 1727204158.08101: getting the remaining hosts for this loop 16380 1727204158.08102: done getting the remaining hosts for this loop 16380 1727204158.08106: getting the next task for host managed-node2 16380 1727204158.08116: done getting next task for host managed-node2 16380 1727204158.08118: ^ task is: TASK: meta (flush_handlers) 16380 1727204158.08125: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204158.08134: getting variables 16380 1727204158.08136: in VariableManager get_vars() 16380 1727204158.08153: Calling all_inventory to load vars for managed-node2 16380 1727204158.08156: Calling groups_inventory to load vars for managed-node2 16380 1727204158.08159: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204158.08166: Calling all_plugins_play to load vars for managed-node2 16380 1727204158.08169: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204158.08173: Calling groups_plugins_play to load vars for managed-node2 16380 1727204158.23047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204158.28637: done with get_vars() 16380 1727204158.28678: done getting variables 16380 1727204158.29101: in VariableManager get_vars() 16380 1727204158.29118: Calling all_inventory to load vars for managed-node2 16380 1727204158.29121: Calling groups_inventory to load vars for managed-node2 16380 1727204158.29124: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204158.29130: Calling all_plugins_play to load vars for managed-node2 16380 1727204158.29133: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204158.29137: Calling groups_plugins_play to load vars for managed-node2 16380 1727204158.36908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204158.42974: done with get_vars() 16380 1727204158.43025: done queuing things up, now waiting for results queue to drain 16380 1727204158.43028: results queue empty 16380 1727204158.43029: checking for any_errors_fatal 16380 1727204158.43031: done checking for any_errors_fatal 16380 1727204158.43032: checking for max_fail_percentage 16380 1727204158.43033: done checking for max_fail_percentage 16380 1727204158.43034: checking to see if all hosts have failed and the running result is not ok 16380 1727204158.43035: done checking to see if all hosts have failed 16380 1727204158.43036: getting the remaining hosts for this loop 16380 1727204158.43038: done getting the remaining hosts for this loop 16380 1727204158.43041: getting the next task for host managed-node2 16380 1727204158.43045: done getting next task for host managed-node2 16380 1727204158.43046: ^ task is: None 16380 1727204158.43048: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204158.43050: done queuing things up, now waiting for results queue to drain 16380 1727204158.43051: results queue empty 16380 1727204158.43052: checking for any_errors_fatal 16380 1727204158.43053: done checking for any_errors_fatal 16380 1727204158.43054: checking for max_fail_percentage 16380 1727204158.43055: done checking for max_fail_percentage 16380 1727204158.43056: checking to see if all hosts have failed and the running result is not ok 16380 1727204158.43057: done checking to see if all hosts have failed 16380 1727204158.43058: getting the next task for host managed-node2 16380 1727204158.43061: done getting next task for host managed-node2 16380 1727204158.43062: ^ task is: None 16380 1727204158.43064: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204158.43117: in VariableManager get_vars() 16380 1727204158.43138: done with get_vars() 16380 1727204158.43146: in VariableManager get_vars() 16380 1727204158.43159: done with get_vars() 16380 1727204158.43164: variable 'omit' from source: magic vars 16380 1727204158.43292: variable 'task' from source: play vars 16380 1727204158.43332: in VariableManager get_vars() 16380 1727204158.43345: done with get_vars() 16380 1727204158.43366: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 16380 1727204158.43673: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204158.43696: getting the remaining hosts for this loop 16380 1727204158.43697: done getting the remaining hosts for this loop 16380 1727204158.43700: getting the next task for host managed-node2 16380 1727204158.43703: done getting next task for host managed-node2 16380 1727204158.43705: ^ task is: TASK: Gathering Facts 16380 1727204158.43707: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204158.43711: getting variables 16380 1727204158.43713: in VariableManager get_vars() 16380 1727204158.43726: Calling all_inventory to load vars for managed-node2 16380 1727204158.43729: Calling groups_inventory to load vars for managed-node2 16380 1727204158.43736: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204158.43742: Calling all_plugins_play to load vars for managed-node2 16380 1727204158.43745: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204158.43749: Calling groups_plugins_play to load vars for managed-node2 16380 1727204158.45987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204158.49203: done with get_vars() 16380 1727204158.49238: done getting variables 16380 1727204158.49309: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:55:58 -0400 (0:00:01.261) 0:00:19.599 ***** 16380 1727204158.49339: entering _queue_task() for managed-node2/gather_facts 16380 1727204158.49729: worker is 1 (out of 1 available) 16380 1727204158.49744: exiting _queue_task() for managed-node2/gather_facts 16380 1727204158.49757: done queuing things up, now waiting for results queue to drain 16380 1727204158.49759: waiting for pending results... 
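The Gathering Facts task that starts here repeats the same six-step transport lifecycle the ping task just went through: probe the remote home directory, create a private temp directory under umask 77, upload the AnsiballZ payload over SFTP, chmod it executable, run it with the remote python, and remove the temp directory. A condensed sketch of that sequence follows; the host and paths are lifted from the log, but the ssh() helper, the scp shortcut standing in for the log's sftp put, the payload filename, and the timestamp suffix are all illustrative assumptions.

```python
#!/usr/bin/env python3
"""Sketch of the per-task SSH round trips visible in this log; not Ansible code."""
import subprocess
import time

HOST = "root@10.31.9.159"  # the target the log's multiplexed SSH connection points at

def ssh(cmd: str) -> str:
    # Rough analogue of _low_level_execute_command(): run cmd via /bin/sh remotely.
    return subprocess.run(["ssh", HOST, f"/bin/sh -c '{cmd}'"],
                          capture_output=True, text=True, check=True).stdout

home = ssh("echo ~ && sleep 0").strip()                         # 1. resolve remote home
tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}-demo"  # hypothetical suffix
ssh(f'umask 77 && mkdir -p "{tmpdir}"')                         # 2. private 0700 tmpdir
subprocess.run(["scp", "AnsiballZ_module.py",                   # 3. upload the payload
                f"{HOST}:{tmpdir}/"], check=True)               #    (log uses sftp put)
ssh(f"chmod u+x {tmpdir}/ {tmpdir}/AnsiballZ_module.py")        # 4. mark executable
print(ssh(f"/usr/bin/python3.12 {tmpdir}/AnsiballZ_module.py && sleep 0"))  # 5. execute
ssh(f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0")          # 6. clean up tmpdir
```

Pipelining (ansible_pipelining, set to False earlier in this log) would collapse steps 2-6 into a single execution of the remote python with the payload fed over stdin, which is why the full mkdir/put/chmod/rm dance appears here for every module run.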
16380 1727204158.50119: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204158.50233: in run() - task 12b410aa-8751-749c-b6eb-000000000219 16380 1727204158.50244: variable 'ansible_search_path' from source: unknown 16380 1727204158.50288: calling self._execute() 16380 1727204158.50434: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204158.50438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204158.50441: variable 'omit' from source: magic vars 16380 1727204158.50881: variable 'ansible_distribution_major_version' from source: facts 16380 1727204158.50904: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204158.50916: variable 'omit' from source: magic vars 16380 1727204158.50978: variable 'omit' from source: magic vars 16380 1727204158.51000: variable 'omit' from source: magic vars 16380 1727204158.51047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204158.51098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204158.51130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204158.51197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204158.51200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204158.51218: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204158.51228: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204158.51237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204158.51360: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204158.51375: Set connection var ansible_shell_executable to /bin/sh 16380 1727204158.51416: Set connection var ansible_connection to ssh 16380 1727204158.51419: Set connection var ansible_shell_type to sh 16380 1727204158.51422: Set connection var ansible_pipelining to False 16380 1727204158.51437: Set connection var ansible_timeout to 10 16380 1727204158.51468: variable 'ansible_shell_executable' from source: unknown 16380 1727204158.51476: variable 'ansible_connection' from source: unknown 16380 1727204158.51494: variable 'ansible_module_compression' from source: unknown 16380 1727204158.51497: variable 'ansible_shell_type' from source: unknown 16380 1727204158.51499: variable 'ansible_shell_executable' from source: unknown 16380 1727204158.51523: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204158.51526: variable 'ansible_pipelining' from source: unknown 16380 1727204158.51528: variable 'ansible_timeout' from source: unknown 16380 1727204158.51530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204158.51745: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204158.51765: variable 'omit' from source: magic vars 16380 1727204158.51803: starting attempt loop 16380 1727204158.51807: running the 
handler 16380 1727204158.51809: variable 'ansible_facts' from source: unknown 16380 1727204158.51830: _low_level_execute_command(): starting 16380 1727204158.51842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204158.52625: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204158.52709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204158.52761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204158.52785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204158.52805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204158.52924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204158.54747: stdout chunk (state=3): >>>/root <<< 16380 1727204158.54906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204158.54950: stderr chunk (state=3): >>><<< 16380 1727204158.54963: stdout chunk (state=3): >>><<< 16380 1727204158.54994: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204158.55095: _low_level_execute_command(): starting 16380 1727204158.55099: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054 `" && echo ansible-tmp-1727204158.5500147-17928-220336034251054="` echo 
/root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054 `" ) && sleep 0' 16380 1727204158.55668: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204158.55693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204158.55712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204158.55738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204158.55757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204158.55771: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204158.55872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204158.55902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204158.55986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204158.58064: stdout chunk (state=3): >>>ansible-tmp-1727204158.5500147-17928-220336034251054=/root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054 <<< 16380 1727204158.58169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204158.58246: stderr chunk (state=3): >>><<< 16380 1727204158.58249: stdout chunk (state=3): >>><<< 16380 1727204158.58265: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204158.5500147-17928-220336034251054=/root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204158.58293: variable 'ansible_module_compression' from source: unknown 16380 1727204158.58339: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204158.58404: variable 'ansible_facts' from source: unknown 16380 1727204158.58552: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/AnsiballZ_setup.py 16380 1727204158.58664: Sending initial data 16380 1727204158.58668: Sent initial data (154 bytes) 16380 1727204158.59118: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204158.59122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204158.59125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204158.59127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204158.59186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204158.59191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204158.59228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204158.61044: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204158.61071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/AnsiballZ_setup.py" <<< 16380 1727204158.61095: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpus8_y43y /root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/AnsiballZ_setup.py <<< 16380 1727204158.61140: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpus8_y43y" to remote "/root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/AnsiballZ_setup.py" <<< 16380 1727204158.63993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204158.64021: stderr chunk (state=3): >>><<< 16380 1727204158.64025: stdout chunk (state=3): >>><<< 16380 1727204158.64050: done transferring module to remote 16380 1727204158.64060: _low_level_execute_command(): starting 16380 1727204158.64066: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/ /root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/AnsiballZ_setup.py && sleep 0' 16380 1727204158.64627: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204158.64655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204158.64722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204158.66683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204158.66729: stderr chunk (state=3): >>><<< 16380 1727204158.66733: stdout chunk (state=3): >>><<< 16380 1727204158.66747: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204158.66754: _low_level_execute_command(): starting 16380 1727204158.66757: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/AnsiballZ_setup.py && sleep 0' 16380 1727204158.67188: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204158.67194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204158.67197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204158.67199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204158.67203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204158.67254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204158.67260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204158.67302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204159.36866: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", 
"USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "58", "epoch": "1727204158", "epoch_int": "1727204158", "date": "2024-09-24", "time": "14:55:58", "iso8601_micro": "2024-09-24T18:55:58.987260Z", "iso8601": "2024-09-24T18:55:58Z", "iso8601_basic": "20240924T145558987260", "iso8601_basic_short": "20240924T145558", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "", "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_loadavg": {"1m": 0.46875, "5m": 0.5302734375, "15m": 0.34033203125}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:07:3c:a4:a3:3f", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2849, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 868, "free": 2849}, "nocache": {"free": 3477, "used": 240}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 663, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251155996672, "block_size": 4096, "block_total": 64479564, "block_available": 61317382, "block_used": 3162182, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204159.39111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204159.39116: stdout chunk (state=3): >>><<< 16380 1727204159.39119: stderr chunk (state=3): >>><<< 16380 1727204159.39296: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "58", "epoch": "1727204158", "epoch_int": "1727204158", "date": "2024-09-24", "time": "14:55:58", "iso8601_micro": "2024-09-24T18:55:58.987260Z", "iso8601": "2024-09-24T18:55:58Z", "iso8601_basic": "20240924T145558987260", "iso8601_basic_short": "20240924T145558", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "", "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_loadavg": {"1m": 0.46875, "5m": 0.5302734375, "15m": 0.34033203125}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": 
"255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:07:3c:a4:a3:3f", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2849, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 868, "free": 2849}, "nocache": {"free": 3477, "used": 240}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 663, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": 
{}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251155996672, "block_size": 4096, "block_total": 64479564, "block_available": 61317382, "block_used": 3162182, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
16380 1727204159.39654: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204159.39673: _low_level_execute_command(): starting 16380 1727204159.39683: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204158.5500147-17928-220336034251054/ > /dev/null 2>&1 && sleep 0' 16380 1727204159.40447: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204159.40463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204159.40478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204159.40501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204159.40615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204159.41009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204159.41300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204159.43552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204159.43555: stdout chunk (state=3): >>><<< 16380 1727204159.43558: stderr chunk (state=3): >>><<< 16380 1727204159.43560: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204159.43563: handler run complete
16380 1727204159.44206: variable 'ansible_facts' from source: unknown
16380 1727204159.44531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204159.46564: variable 'ansible_facts' from source: unknown
16380 1727204159.46800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204159.47222: attempt loop complete, returning result
16380 1727204159.47300: _execute() done
16380 1727204159.47309: dumping result to json
16380 1727204159.47353: done dumping result, returning
16380 1727204159.47367: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-000000000219]
16380 1727204159.47406: sending task result for task 12b410aa-8751-749c-b6eb-000000000219
ok: [managed-node2]
16380 1727204159.49265: done sending task result for task 12b410aa-8751-749c-b6eb-000000000219
16380 1727204159.49268: WORKER PROCESS EXITING
16380 1727204159.49281: no more pending results, returning what we have
16380 1727204159.49284: results queue empty
16380 1727204159.49285: checking for any_errors_fatal
16380 1727204159.49287: done checking for any_errors_fatal
16380 1727204159.49287: checking for max_fail_percentage
16380 1727204159.49291: done checking for max_fail_percentage
16380 1727204159.49293: checking to see if all hosts have failed and the running result is not ok
16380 1727204159.49294: done checking to see if all hosts have failed
16380 1727204159.49294: getting the remaining hosts for this loop
16380 1727204159.49296: done getting the remaining hosts for this loop
16380 1727204159.49300: getting the next task for host managed-node2
16380 1727204159.49305: done getting next task for host managed-node2
16380 1727204159.49307: ^ task is: TASK: meta (flush_handlers)
16380 1727204159.49309: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
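The `meta (flush_handlers)` task queued next is one Ansible inserts automatically between sections of a play; it runs any notified handlers before the following tasks. Nothing visible happens here because no handlers were notified. For reference, the same flush can be requested explicitly in a playbook; this is a standard construct, not something taken from these test files:

```yaml
# Explicitly flush notified handlers mid-play (standard Ansible construct,
# shown for reference only).
- name: Flush handlers before continuing
  ansible.builtin.meta: flush_handlers
```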
16380 1727204159.49313: getting variables
16380 1727204159.49314: in VariableManager get_vars()
16380 1727204159.49337: Calling all_inventory to load vars for managed-node2
16380 1727204159.49340: Calling groups_inventory to load vars for managed-node2
16380 1727204159.49343: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204159.49355: Calling all_plugins_play to load vars for managed-node2
16380 1727204159.49358: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204159.49362: Calling groups_plugins_play to load vars for managed-node2
16380 1727204159.53647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204159.58352: done with get_vars()
16380 1727204159.58701: done getting variables
16380 1727204159.58791: in VariableManager get_vars()
16380 1727204159.58805: Calling all_inventory to load vars for managed-node2
16380 1727204159.58809: Calling groups_inventory to load vars for managed-node2
16380 1727204159.58812: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204159.58819: Calling all_plugins_play to load vars for managed-node2
16380 1727204159.58822: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204159.58825: Calling groups_plugins_play to load vars for managed-node2
16380 1727204159.63252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204159.69577: done with get_vars()
16380 1727204159.69629: done queuing things up, now waiting for results queue to drain
16380 1727204159.69632: results queue empty
16380 1727204159.69633: checking for any_errors_fatal
16380 1727204159.69640: done checking for any_errors_fatal
16380 1727204159.69641: checking for max_fail_percentage
16380 1727204159.69642: done checking for max_fail_percentage
16380 1727204159.69643: checking to see if all hosts have failed and the running result is not ok
16380 1727204159.69649: done checking to see if all hosts have failed
16380 1727204159.69650: getting the remaining hosts for this loop
16380 1727204159.69651: done getting the remaining hosts for this loop
16380 1727204159.69655: getting the next task for host managed-node2
16380 1727204159.69660: done getting next task for host managed-node2
16380 1727204159.69663: ^ task is: TASK: Include the task '{{ task }}'
16380 1727204159.69665: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
16380 1727204159.69668: getting variables
16380 1727204159.69669: in VariableManager get_vars()
16380 1727204159.69682: Calling all_inventory to load vars for managed-node2
16380 1727204159.69685: Calling groups_inventory to load vars for managed-node2
16380 1727204159.69688: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204159.69901: Calling all_plugins_play to load vars for managed-node2
16380 1727204159.69904: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204159.69908: Calling groups_plugins_play to load vars for managed-node2
16380 1727204159.73950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204159.80026: done with get_vars()
16380 1727204159.80059: done getting variables
16380 1727204159.80259: variable 'task' from source: play vars

TASK [Include the task 'tasks/assert_device_present.yml'] **********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6
Tuesday 24 September 2024 14:55:59 -0400 (0:00:01.309) 0:00:20.909 *****
16380 1727204159.80292: entering _queue_task() for managed-node2/include_tasks
16380 1727204159.80668: worker is 1 (out of 1 available)
16380 1727204159.80684: exiting _queue_task() for managed-node2/include_tasks
16380 1727204159.80697: done queuing things up, now waiting for results queue to drain
16380 1727204159.80699: waiting for pending results...
16380 1727204159.80999: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_present.yml'
16380 1727204159.81135: in run() - task 12b410aa-8751-749c-b6eb-00000000002d
16380 1727204159.81159: variable 'ansible_search_path' from source: unknown
16380 1727204159.81206: calling self._execute()
16380 1727204159.81308: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204159.81325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204159.81346: variable 'omit' from source: magic vars
16380 1727204159.81794: variable 'ansible_distribution_major_version' from source: facts
16380 1727204159.81813: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204159.81824: variable 'task' from source: play vars
16380 1727204159.81922: variable 'task' from source: play vars
16380 1727204159.81936: _execute() done
16380 1727204159.81945: dumping result to json
16380 1727204159.81955: done dumping result, returning
16380 1727204159.81966: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_present.yml' [12b410aa-8751-749c-b6eb-00000000002d]
16380 1727204159.81980: sending task result for task 12b410aa-8751-749c-b6eb-00000000002d
16380 1727204159.82165: done sending task result for task 12b410aa-8751-749c-b6eb-00000000002d
16380 1727204159.82168: WORKER PROCESS EXITING
16380 1727204159.82200: no more pending results, returning what we have
16380 1727204159.82206: in VariableManager get_vars()
16380 1727204159.82246: Calling all_inventory to load vars for managed-node2
16380 1727204159.82250: Calling groups_inventory to load vars for managed-node2
16380 1727204159.82254: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204159.82272: Calling all_plugins_play to load vars for managed-node2
16380 1727204159.82276: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204159.82281: Calling groups_plugins_play to load vars for managed-node2
16380 1727204159.84595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204159.87463: done with get_vars()
16380 1727204159.87498: variable 'ansible_search_path' from source: unknown
16380 1727204159.87518: we have included files to process
16380 1727204159.87519: generating all_blocks data
16380 1727204159.87521: done generating all_blocks data
16380 1727204159.87522: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
16380 1727204159.87524: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
16380 1727204159.87527: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
16380 1727204159.87726: in VariableManager get_vars()
16380 1727204159.87746: done with get_vars()
16380 1727204159.87882: done processing included file
16380 1727204159.87885: iterating over new_blocks loaded from include file
16380 1727204159.87887: in VariableManager get_vars()
16380 1727204159.87904: done with get_vars()
16380 1727204159.87905: filtering new block on tags
16380 1727204159.87930: done filtering new block on tags
16380 1727204159.87933: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2
16380 1727204159.87939: extending task lists for all hosts with included blocks
16380 1727204159.87979: done extending task lists
16380 1727204159.87980: done processing included files
16380 1727204159.87981: results queue empty
16380 1727204159.87982: checking for any_errors_fatal
16380 1727204159.87984: done checking for any_errors_fatal
16380 1727204159.87985: checking for max_fail_percentage
16380 1727204159.87986: done checking for max_fail_percentage
16380 1727204159.87988: checking to see if all hosts have failed and the running result is not ok
16380 1727204159.87990: done checking to see if all hosts have failed
16380 1727204159.87991: getting the remaining hosts for this loop
16380 1727204159.87993: done getting the remaining hosts for this loop
16380 1727204159.87996: getting the next task for host managed-node2
16380 1727204159.88000: done getting next task for host managed-node2
16380 1727204159.88003: ^ task is: TASK: Include the task 'get_interface_stat.yml'
16380 1727204159.88005: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
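The trace shows a two-level include chain: run_tasks.yml templates the play variable `task` into an include of tasks/assert_device_present.yml (guarded by the `ansible_distribution_major_version != '6'` conditional evaluated above), and that file in turn includes get_interface_stat.yml. The included files' contents are not reproduced in this log, so the following is only a reconstruction of the driver pattern at run_tasks.yml:6:

```yaml
# Reconstructed sketch of the include at run_tasks.yml:6 (the actual file is
# not shown in this log). "task" is a play variable; in this run it resolves
# to "tasks/assert_device_present.yml".
- name: Include the task '{{ task }}'
  ansible.builtin.include_tasks: "{{ task }}"
  when: ansible_distribution_major_version != '6'
```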
16380 1727204159.88008: getting variables
16380 1727204159.88011: in VariableManager get_vars()
16380 1727204159.88021: Calling all_inventory to load vars for managed-node2
16380 1727204159.88024: Calling groups_inventory to load vars for managed-node2
16380 1727204159.88027: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204159.88033: Calling all_plugins_play to load vars for managed-node2
16380 1727204159.88037: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204159.88040: Calling groups_plugins_play to load vars for managed-node2
16380 1727204159.90132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204159.93270: done with get_vars()
16380 1727204159.93307: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.131) 0:00:21.040 *****
16380 1727204159.93403: entering _queue_task() for managed-node2/include_tasks
16380 1727204159.94225: worker is 1 (out of 1 available)
16380 1727204159.94238: exiting _queue_task() for managed-node2/include_tasks
16380 1727204159.94249: done queuing things up, now waiting for results queue to drain
16380 1727204159.94251: waiting for pending results...
16380 1727204159.94812: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml'
16380 1727204159.94852: in run() - task 12b410aa-8751-749c-b6eb-00000000022a
16380 1727204159.94913: variable 'ansible_search_path' from source: unknown
16380 1727204159.94924: variable 'ansible_search_path' from source: unknown
16380 1727204159.95151: calling self._execute()
16380 1727204159.95197: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204159.95272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204159.95294: variable 'omit' from source: magic vars
16380 1727204159.96257: variable 'ansible_distribution_major_version' from source: facts
16380 1727204159.96275: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204159.96286: _execute() done
16380 1727204159.96299: dumping result to json
16380 1727204159.96358: done dumping result, returning
16380 1727204159.96371: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-749c-b6eb-00000000022a]
16380 1727204159.96384: sending task result for task 12b410aa-8751-749c-b6eb-00000000022a
16380 1727204159.96708: done sending task result for task 12b410aa-8751-749c-b6eb-00000000022a
16380 1727204159.96713: WORKER PROCESS EXITING
16380 1727204159.96744: no more pending results, returning what we have
16380 1727204159.96750: in VariableManager get_vars()
16380 1727204159.96786: Calling all_inventory to load vars for managed-node2
16380 1727204159.96791: Calling groups_inventory to load vars for managed-node2
16380 1727204159.96795: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204159.96814: Calling all_plugins_play to load vars for managed-node2
16380 1727204159.96819: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204159.96824: Calling groups_plugins_play to load vars for managed-node2
16380 1727204160.00645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204160.07594: done with get_vars()
16380 1727204160.07628: variable 'ansible_search_path' from source: unknown
16380 1727204160.07629: variable 'ansible_search_path' from source: unknown
16380 1727204160.07642: variable 'task' from source: play vars
16380 1727204160.07776: variable 'task' from source: play vars
16380 1727204160.08033: we have included files to process
16380 1727204160.08035: generating all_blocks data
16380 1727204160.08037: done generating all_blocks data
16380 1727204160.08039: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
16380 1727204160.08040: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
16380 1727204160.08043: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
16380 1727204160.08476: done processing included file
16380 1727204160.08478: iterating over new_blocks loaded from include file
16380 1727204160.08480: in VariableManager get_vars()
16380 1727204160.08500: done with get_vars()
16380 1727204160.08502: filtering new block on tags
16380 1727204160.08522: done filtering new block on tags
16380 1727204160.08525: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2
16380 1727204160.08530: extending task lists for all hosts with included blocks
16380 1727204160.08851: done extending task lists
16380 1727204160.08853: done processing included files
16380 1727204160.08854: results queue empty
16380 1727204160.08855: checking for any_errors_fatal
16380 1727204160.08859: done checking for any_errors_fatal
16380 1727204160.08860: checking for max_fail_percentage
16380 1727204160.08861: done checking for max_fail_percentage
16380 1727204160.08862: checking to see if all hosts have failed and the running result is not ok
16380 1727204160.08864: done checking to see if all hosts have failed
16380 1727204160.08864: getting the remaining hosts for this loop
16380 1727204160.08866: done getting the remaining hosts for this loop
16380 1727204160.08869: getting the next task for host managed-node2
16380 1727204160.08874: done getting next task for host managed-node2
16380 1727204160.08877: ^ task is: TASK: Get stat for interface {{ interface }}
16380 1727204160.08880: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
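The innermost file, get_interface_stat.yml, supplies the `Get stat for interface {{ interface }}` task queued next; `interface` was set earlier via `set_fact` and resolves to `LSR-TST-br31`. Its body is not reproduced in this log, but an interface-presence probe of roughly this shape would match the behaviour seen below, where the stat module is dispatched to the node. The `/sys/class/net` path and the register name are assumptions:

```yaml
# Sketch of a stat-based interface probe like get_interface_stat.yml:3.
# Only the task name and the use of the stat module are visible in this
# trace; the path and "register" name here are assumptions.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat
```

assert_device_present.yml can then assert on the registered result (for example `interface_stat.stat.exists`) to fail the test when the device is missing.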
16380 1727204160.08882: getting variables
16380 1727204160.08884: in VariableManager get_vars()
16380 1727204160.08895: Calling all_inventory to load vars for managed-node2
16380 1727204160.08899: Calling groups_inventory to load vars for managed-node2
16380 1727204160.08902: Calling all_plugins_inventory to load vars for managed-node2
16380 1727204160.08908: Calling all_plugins_play to load vars for managed-node2
16380 1727204160.08911: Calling groups_plugins_inventory to load vars for managed-node2
16380 1727204160.08915: Calling groups_plugins_play to load vars for managed-node2
16380 1727204160.12399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
16380 1727204160.19244: done with get_vars()
16380 1727204160.19470: done getting variables
16380 1727204160.19722: variable 'interface' from source: set_fact

TASK [Get stat for interface LSR-TST-br31] *************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.263) 0:00:21.304 *****
16380 1727204160.19757: entering _queue_task() for managed-node2/stat
16380 1727204160.20472: worker is 1 (out of 1 available)
16380 1727204160.20485: exiting _queue_task() for managed-node2/stat
16380 1727204160.20601: done queuing things up, now waiting for results queue to drain
16380 1727204160.20603: waiting for pending results...
16380 1727204160.20908: running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31
16380 1727204160.20955: in run() - task 12b410aa-8751-749c-b6eb-000000000235
16380 1727204160.20978: variable 'ansible_search_path' from source: unknown
16380 1727204160.20986: variable 'ansible_search_path' from source: unknown
16380 1727204160.21037: calling self._execute()
16380 1727204160.21146: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204160.21160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204160.21178: variable 'omit' from source: magic vars
16380 1727204160.21629: variable 'ansible_distribution_major_version' from source: facts
16380 1727204160.21648: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204160.21661: variable 'omit' from source: magic vars
16380 1727204160.21732: variable 'omit' from source: magic vars
16380 1727204160.21897: variable 'interface' from source: set_fact
16380 1727204160.21900: variable 'omit' from source: magic vars
16380 1727204160.21938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
16380 1727204160.21984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
16380 1727204160.22016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
16380 1727204160.22043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204160.22061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204160.22115: variable 'inventory_hostname' from source: host vars for 'managed-node2'
16380 1727204160.22118: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204160.22194: variable 'ansible_ssh_extra_args' from source: host vars for
'managed-node2' 16380 1727204160.22257: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204160.22270: Set connection var ansible_shell_executable to /bin/sh 16380 1727204160.22282: Set connection var ansible_connection to ssh 16380 1727204160.22295: Set connection var ansible_shell_type to sh 16380 1727204160.22306: Set connection var ansible_pipelining to False 16380 1727204160.22320: Set connection var ansible_timeout to 10 16380 1727204160.22356: variable 'ansible_shell_executable' from source: unknown 16380 1727204160.22366: variable 'ansible_connection' from source: unknown 16380 1727204160.22374: variable 'ansible_module_compression' from source: unknown 16380 1727204160.22382: variable 'ansible_shell_type' from source: unknown 16380 1727204160.22392: variable 'ansible_shell_executable' from source: unknown 16380 1727204160.22401: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204160.22410: variable 'ansible_pipelining' from source: unknown 16380 1727204160.22418: variable 'ansible_timeout' from source: unknown 16380 1727204160.22428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204160.22681: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204160.22767: variable 'omit' from source: magic vars 16380 1727204160.22770: starting attempt loop 16380 1727204160.22773: running the handler 16380 1727204160.22775: _low_level_execute_command(): starting 16380 1727204160.22778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204160.23519: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204160.23543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204160.23613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204160.23679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204160.23711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204160.23780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204160.23803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204160.25701: stdout chunk (state=3): >>>/root <<< 16380 1727204160.25807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204160.25934: stderr chunk (state=3): >>><<< 16380 1727204160.25938: stdout chunk (state=3): >>><<< 16380 
1727204160.26107: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204160.26112: _low_level_execute_command(): starting 16380 1727204160.26116: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905 `" && echo ansible-tmp-1727204160.2596173-18148-250642688437905="` echo /root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905 `" ) && sleep 0' 16380 1727204160.27342: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204160.27705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204160.27717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204160.27740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204160.27812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204160.29919: stdout chunk (state=3): >>>ansible-tmp-1727204160.2596173-18148-250642688437905=/root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905 <<< 16380 1727204160.30032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204160.30122: stderr chunk (state=3): >>><<< 16380 1727204160.30306: stdout chunk (state=3): >>><<< 16380 1727204160.30495: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204160.2596173-18148-250642688437905=/root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204160.30498: variable 'ansible_module_compression' from source: unknown 16380 1727204160.30501: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16380 1727204160.30894: variable 'ansible_facts' from source: unknown 16380 1727204160.30898: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/AnsiballZ_stat.py 16380 1727204160.31248: Sending initial data 16380 1727204160.31251: Sent initial data (153 bytes) 16380 1727204160.32613: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204160.32666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204160.32706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204160.32867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204160.34636: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 
debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204160.34675: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204160.34725: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpncynw34i /root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/AnsiballZ_stat.py <<< 16380 1727204160.34738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/AnsiballZ_stat.py" <<< 16380 1727204160.34765: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpncynw34i" to remote "/root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/AnsiballZ_stat.py" <<< 16380 1727204160.36763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204160.37172: stderr chunk (state=3): >>><<< 16380 1727204160.37176: stdout chunk (state=3): >>><<< 16380 1727204160.37178: done transferring module to remote 16380 1727204160.37181: _low_level_execute_command(): starting 16380 1727204160.37184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/ /root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/AnsiballZ_stat.py && sleep 0' 16380 1727204160.38361: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204160.38502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204160.38561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204160.38751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204160.40800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204160.40819: stdout chunk (state=3): >>><<< 16380 1727204160.40832: stderr chunk (state=3): >>><<< 16380 1727204160.40938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204160.40952: _low_level_execute_command(): starting 16380 1727204160.40962: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/AnsiballZ_stat.py && sleep 0' 16380 1727204160.42153: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204160.42376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204160.42393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204160.42477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204160.60431: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36383, "dev": 23, "nlink": 1, "atime": 1727204156.8995714, "mtime": 1727204156.8995714, "ctime": 1727204156.8995714, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, 
"get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16380 1727204160.61931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204160.62099: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 16380 1727204160.62103: stdout chunk (state=3): >>><<< 16380 1727204160.62106: stderr chunk (state=3): >>><<< 16380 1727204160.62151: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36383, "dev": 23, "nlink": 1, "atime": 1727204156.8995714, "mtime": 1727204156.8995714, "ctime": 1727204156.8995714, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
16380 1727204160.62400: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204160.62404: _low_level_execute_command(): starting 16380 1727204160.62407: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204160.2596173-18148-250642688437905/ > /dev/null 2>&1 && sleep 0' 16380 1727204160.63865: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204160.64030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204160.64054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204160.64138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204160.66213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204160.66332: stderr chunk (state=3): >>><<< 16380 1727204160.66344: stdout chunk (state=3): >>><<< 16380 1727204160.66497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204160.66505: handler run complete 16380 1727204160.66634: attempt loop complete, returning result 16380 1727204160.66643: _execute() done 16380 1727204160.66651: dumping result to json 16380 1727204160.66662: done dumping result, returning 16380 1727204160.66677: done running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000235] 16380 1727204160.66686: sending task result for task 12b410aa-8751-749c-b6eb-000000000235 16380 1727204160.67031: done sending task result for task 12b410aa-8751-749c-b6eb-000000000235 16380 1727204160.67034: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "atime": 1727204156.8995714,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1727204156.8995714,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 36383,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31",
        "lnk_target": "../../devices/virtual/net/LSR-TST-br31",
        "mode": "0777",
        "mtime": 1727204156.8995714,
        "nlink": 1,
        "path": "/sys/class/net/LSR-TST-br31",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
16380 1727204160.67355: no more pending results, returning what we have 16380 1727204160.67359: results queue empty 16380 1727204160.67360: checking for any_errors_fatal 16380 1727204160.67362: done checking for any_errors_fatal 16380 1727204160.67363: checking for max_fail_percentage 16380 1727204160.67365: done checking for max_fail_percentage 16380 1727204160.67366: checking to see if all hosts have failed and the running result is not ok 16380 1727204160.67367: done checking to see if all hosts have failed 16380 1727204160.67368: getting the remaining hosts for this loop 16380 1727204160.67370: done getting the remaining hosts for this loop 16380 1727204160.67375: getting the next task for host managed-node2 16380 1727204160.67384: done getting next task for host managed-node2 16380 1727204160.67388: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 16380 1727204160.67594: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 16380 1727204160.67599: getting variables 16380 1727204160.67601: in VariableManager get_vars() 16380 1727204160.67630: Calling all_inventory to load vars for managed-node2 16380 1727204160.67633: Calling groups_inventory to load vars for managed-node2 16380 1727204160.67637: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204160.67649: Calling all_plugins_play to load vars for managed-node2 16380 1727204160.67653: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204160.67657: Calling groups_plugins_play to load vars for managed-node2 16380 1727204160.72766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204160.78777: done with get_vars() 16380 1727204160.79030: done getting variables 16380 1727204160.79112: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204160.79462: variable 'interface' from source: set_fact

TASK [Assert that the interface is present - 'LSR-TST-br31'] *******************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Tuesday 24 September 2024  14:56:00 -0400 (0:00:00.597)       0:00:21.901 *****
16380 1727204160.79502: entering _queue_task() for managed-node2/assert 16380 1727204160.80303: worker is 1 (out of 1 available) 16380 1727204160.80317: exiting _queue_task() for managed-node2/assert 16380 1727204160.80331: done queuing things up, now waiting for results queue to drain 16380 1727204160.80333: waiting for pending results...
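Before the worker picks this task up, note how the "Evaluated conditional" records that follow are produced: the assert action wraps each condition in a Jinja2 {% if %} test and renders it against the task's variables, here the registered 'interface_stat' result. A small sketch of that evaluation follows; the variable contents are abbreviated from the stat output above, and the exact template shape is a simplification of what Ansible's conditional code does internally.

from jinja2 import Environment

# Abbreviated from the registered stat result shown earlier in the log.
interface_stat = {"stat": {"exists": True,
                           "islnk": True,
                           "path": "/sys/class/net/LSR-TST-br31"}}

conditional = "interface_stat.stat.exists"

# Render the conditional inside an {% if %} block and inspect the result,
# mirroring the "Evaluated conditional (...): True" records in this log.
rendered = Environment().from_string(
    "{%% if %s %%}True{%% else %%}False{%% endif %%}" % conditional
).render(interface_stat=interface_stat)

print("Evaluated conditional (%s): %s" % (conditional, rendered == "True"))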
16380 1727204160.81113: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'LSR-TST-br31' 16380 1727204160.81175: in run() - task 12b410aa-8751-749c-b6eb-00000000022b 16380 1727204160.81276: variable 'ansible_search_path' from source: unknown 16380 1727204160.81286: variable 'ansible_search_path' from source: unknown 16380 1727204160.81341: calling self._execute() 16380 1727204160.81812: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204160.81816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204160.81819: variable 'omit' from source: magic vars 16380 1727204160.82650: variable 'ansible_distribution_major_version' from source: facts 16380 1727204160.82670: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204160.82898: variable 'omit' from source: magic vars 16380 1727204160.82902: variable 'omit' from source: magic vars 16380 1727204160.83127: variable 'interface' from source: set_fact 16380 1727204160.83337: variable 'omit' from source: magic vars 16380 1727204160.83341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204160.83415: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204160.83553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204160.83580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204160.83600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204160.83641: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204160.83667: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204160.83675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204160.83989: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204160.84006: Set connection var ansible_shell_executable to /bin/sh 16380 1727204160.84074: Set connection var ansible_connection to ssh 16380 1727204160.84087: Set connection var ansible_shell_type to sh 16380 1727204160.84104: Set connection var ansible_pipelining to False 16380 1727204160.84120: Set connection var ansible_timeout to 10 16380 1727204160.84194: variable 'ansible_shell_executable' from source: unknown 16380 1727204160.84394: variable 'ansible_connection' from source: unknown 16380 1727204160.84398: variable 'ansible_module_compression' from source: unknown 16380 1727204160.84400: variable 'ansible_shell_type' from source: unknown 16380 1727204160.84403: variable 'ansible_shell_executable' from source: unknown 16380 1727204160.84405: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204160.84407: variable 'ansible_pipelining' from source: unknown 16380 1727204160.84411: variable 'ansible_timeout' from source: unknown 16380 1727204160.84413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204160.84740: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 16380 1727204160.84769: variable 'omit' from source: magic vars 16380 1727204160.84804: starting attempt loop 16380 1727204160.84829: running the handler 16380 1727204160.85302: variable 'interface_stat' from source: set_fact 16380 1727204160.85337: Evaluated conditional (interface_stat.stat.exists): True 16380 1727204160.85394: handler run complete 16380 1727204160.85428: attempt loop complete, returning result 16380 1727204160.85475: _execute() done 16380 1727204160.85485: dumping result to json 16380 1727204160.85496: done dumping result, returning 16380 1727204160.85530: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'LSR-TST-br31' [12b410aa-8751-749c-b6eb-00000000022b] 16380 1727204160.85683: sending task result for task 12b410aa-8751-749c-b6eb-00000000022b
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed
16380 1727204160.85825: no more pending results, returning what we have 16380 1727204160.85828: results queue empty 16380 1727204160.85829: checking for any_errors_fatal 16380 1727204160.85841: done checking for any_errors_fatal 16380 1727204160.85842: checking for max_fail_percentage 16380 1727204160.85844: done checking for max_fail_percentage 16380 1727204160.85845: checking to see if all hosts have failed and the running result is not ok 16380 1727204160.85846: done checking to see if all hosts have failed 16380 1727204160.85847: getting the remaining hosts for this loop 16380 1727204160.85849: done getting the remaining hosts for this loop 16380 1727204160.85854: getting the next task for host managed-node2 16380 1727204160.85868: done getting next task for host managed-node2 16380 1727204160.85873: ^ task is: TASK: meta (flush_handlers) 16380 1727204160.85876: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 16380 1727204160.85881: getting variables 16380 1727204160.85883: in VariableManager get_vars() 16380 1727204160.85920: Calling all_inventory to load vars for managed-node2 16380 1727204160.85924: Calling groups_inventory to load vars for managed-node2 16380 1727204160.85929: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204160.85944: Calling all_plugins_play to load vars for managed-node2 16380 1727204160.85948: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204160.85952: Calling groups_plugins_play to load vars for managed-node2 16380 1727204160.86718: done sending task result for task 12b410aa-8751-749c-b6eb-00000000022b 16380 1727204160.86723: WORKER PROCESS EXITING 16380 1727204160.90934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204160.97218: done with get_vars() 16380 1727204160.97254: done getting variables 16380 1727204160.97551: in VariableManager get_vars() 16380 1727204160.97565: Calling all_inventory to load vars for managed-node2 16380 1727204160.97568: Calling groups_inventory to load vars for managed-node2 16380 1727204160.97571: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204160.97578: Calling all_plugins_play to load vars for managed-node2 16380 1727204160.97581: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204160.97585: Calling groups_plugins_play to load vars for managed-node2 16380 1727204161.00171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204161.03160: done with get_vars() 16380 1727204161.03217: done queuing things up, now waiting for results queue to drain 16380 1727204161.03220: results queue empty 16380 1727204161.03221: checking for any_errors_fatal 16380 1727204161.03225: done checking for any_errors_fatal 16380 1727204161.03226: checking for max_fail_percentage 16380 1727204161.03228: done checking for max_fail_percentage 16380 1727204161.03229: checking to see if all hosts have failed and the running result is not ok 16380 1727204161.03230: done checking to see if all hosts have failed 16380 1727204161.03240: getting the remaining hosts for this loop 16380 1727204161.03242: done getting the remaining hosts for this loop 16380 1727204161.03247: getting the next task for host managed-node2 16380 1727204161.03252: done getting next task for host managed-node2 16380 1727204161.03255: ^ task is: TASK: meta (flush_handlers) 16380 1727204161.03257: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204161.03260: getting variables 16380 1727204161.03261: in VariableManager get_vars() 16380 1727204161.03274: Calling all_inventory to load vars for managed-node2 16380 1727204161.03277: Calling groups_inventory to load vars for managed-node2 16380 1727204161.03280: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204161.03287: Calling all_plugins_play to load vars for managed-node2 16380 1727204161.03293: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204161.03297: Calling groups_plugins_play to load vars for managed-node2 16380 1727204161.05379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204161.10028: done with get_vars() 16380 1727204161.10070: done getting variables 16380 1727204161.10344: in VariableManager get_vars() 16380 1727204161.10357: Calling all_inventory to load vars for managed-node2 16380 1727204161.10360: Calling groups_inventory to load vars for managed-node2 16380 1727204161.10363: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204161.10369: Calling all_plugins_play to load vars for managed-node2 16380 1727204161.10373: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204161.10377: Calling groups_plugins_play to load vars for managed-node2 16380 1727204161.14355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204161.20379: done with get_vars() 16380 1727204161.20423: done queuing things up, now waiting for results queue to drain 16380 1727204161.20426: results queue empty 16380 1727204161.20427: checking for any_errors_fatal 16380 1727204161.20429: done checking for any_errors_fatal 16380 1727204161.20430: checking for max_fail_percentage 16380 1727204161.20431: done checking for max_fail_percentage 16380 1727204161.20432: checking to see if all hosts have failed and the running result is not ok 16380 1727204161.20433: done checking to see if all hosts have failed 16380 1727204161.20434: getting the remaining hosts for this loop 16380 1727204161.20436: done getting the remaining hosts for this loop 16380 1727204161.20439: getting the next task for host managed-node2 16380 1727204161.20443: done getting next task for host managed-node2 16380 1727204161.20444: ^ task is: None 16380 1727204161.20446: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204161.20447: done queuing things up, now waiting for results queue to drain 16380 1727204161.20449: results queue empty 16380 1727204161.20450: checking for any_errors_fatal 16380 1727204161.20451: done checking for any_errors_fatal 16380 1727204161.20451: checking for max_fail_percentage 16380 1727204161.20453: done checking for max_fail_percentage 16380 1727204161.20453: checking to see if all hosts have failed and the running result is not ok 16380 1727204161.20454: done checking to see if all hosts have failed 16380 1727204161.20456: getting the next task for host managed-node2 16380 1727204161.20458: done getting next task for host managed-node2 16380 1727204161.20459: ^ task is: None 16380 1727204161.20461: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204161.20708: in VariableManager get_vars() 16380 1727204161.20729: done with get_vars() 16380 1727204161.20735: in VariableManager get_vars() 16380 1727204161.20747: done with get_vars() 16380 1727204161.20753: variable 'omit' from source: magic vars 16380 1727204161.20881: variable 'task' from source: play vars 16380 1727204161.21125: in VariableManager get_vars() 16380 1727204161.21139: done with get_vars() 16380 1727204161.21163: variable 'omit' from source: magic vars

PLAY [Run the tasklist tasks/assert_profile_present.yml] ***********************
16380 1727204161.21584: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204161.21796: getting the remaining hosts for this loop 16380 1727204161.21798: done getting the remaining hosts for this loop 16380 1727204161.21801: getting the next task for host managed-node2 16380 1727204161.21805: done getting next task for host managed-node2 16380 1727204161.21807: ^ task is: TASK: Gathering Facts 16380 1727204161.21809: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 16380 1727204161.21811: getting variables 16380 1727204161.21812: in VariableManager get_vars() 16380 1727204161.21822: Calling all_inventory to load vars for managed-node2 16380 1727204161.21825: Calling groups_inventory to load vars for managed-node2 16380 1727204161.21828: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204161.21835: Calling all_plugins_play to load vars for managed-node2 16380 1727204161.21838: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204161.21842: Calling groups_plugins_play to load vars for managed-node2 16380 1727204161.25297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204161.29239: done with get_vars() 16380 1727204161.29278: done getting variables 16380 1727204161.29343: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Tuesday 24 September 2024  14:56:01 -0400 (0:00:00.498)       0:00:22.400 *****
16380 1727204161.29375: entering _queue_task() for managed-node2/gather_facts 16380 1727204161.29749: worker is 1 (out of 1 available) 16380 1727204161.29763: exiting _queue_task() for managed-node2/gather_facts 16380 1727204161.29782: done queuing things up, now waiting for results queue to drain 16380 1727204161.29784: waiting for pending results...
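The gather_facts run queued here repeats the same transfer-and-execute cycle with AnsiballZ_setup.py, whose stdout (visible further below) is one large JSON document under an 'ansible_facts' key; those keys are what conditionals such as ansible_distribution_major_version != '6' are evaluated against. A minimal sketch of consuming that output, with the document abbreviated to a few values that actually appear in the log:

import json

# Abbreviated stand-in for the AnsiballZ_setup.py stdout shown below.
out = """{"ansible_facts": {"ansible_distribution": "Fedora",
                            "ansible_distribution_major_version": "39",
                            "ansible_system": "Linux"}}"""

facts = json.loads(out)["ansible_facts"]

# The conditional logged before each task in this play tests exactly this key.
print(facts["ansible_distribution_major_version"] != "6")   # True
print(facts["ansible_distribution"], facts["ansible_system"])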
16380 1727204161.29977: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204161.30091: in run() - task 12b410aa-8751-749c-b6eb-00000000024e 16380 1727204161.30115: variable 'ansible_search_path' from source: unknown 16380 1727204161.30295: calling self._execute() 16380 1727204161.30299: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204161.30302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204161.30304: variable 'omit' from source: magic vars 16380 1727204161.30710: variable 'ansible_distribution_major_version' from source: facts 16380 1727204161.30729: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204161.30742: variable 'omit' from source: magic vars 16380 1727204161.30778: variable 'omit' from source: magic vars 16380 1727204161.30829: variable 'omit' from source: magic vars 16380 1727204161.30882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204161.30931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204161.30959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204161.30987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204161.31007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204161.31046: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204161.31055: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204161.31064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204161.31193: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204161.31212: Set connection var ansible_shell_executable to /bin/sh 16380 1727204161.31225: Set connection var ansible_connection to ssh 16380 1727204161.31236: Set connection var ansible_shell_type to sh 16380 1727204161.31246: Set connection var ansible_pipelining to False 16380 1727204161.31261: Set connection var ansible_timeout to 10 16380 1727204161.31292: variable 'ansible_shell_executable' from source: unknown 16380 1727204161.31301: variable 'ansible_connection' from source: unknown 16380 1727204161.31313: variable 'ansible_module_compression' from source: unknown 16380 1727204161.31322: variable 'ansible_shell_type' from source: unknown 16380 1727204161.31330: variable 'ansible_shell_executable' from source: unknown 16380 1727204161.31338: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204161.31347: variable 'ansible_pipelining' from source: unknown 16380 1727204161.31403: variable 'ansible_timeout' from source: unknown 16380 1727204161.31406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204161.31661: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204161.31680: variable 'omit' from source: magic vars 16380 1727204161.31744: starting attempt loop 16380 1727204161.31748: running the 
handler 16380 1727204161.31751: variable 'ansible_facts' from source: unknown 16380 1727204161.31754: _low_level_execute_command(): starting 16380 1727204161.31767: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204161.32613: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204161.32637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204161.32655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204161.32736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204161.34611: stdout chunk (state=3): >>>/root <<< 16380 1727204161.35225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204161.35230: stdout chunk (state=3): >>><<< 16380 1727204161.35232: stderr chunk (state=3): >>><<< 16380 1727204161.35236: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204161.35238: _low_level_execute_command(): starting 16380 1727204161.35241: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454 `" && echo ansible-tmp-1727204161.3511631-18179-144053197417454="` echo /root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454 `" ) && sleep 0' 16380 1727204161.36417: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 <<< 16380 1727204161.36435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204161.36454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204161.36521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204161.36524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204161.36807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204161.36896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204161.38997: stdout chunk (state=3): >>>ansible-tmp-1727204161.3511631-18179-144053197417454=/root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454 <<< 16380 1727204161.39115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204161.39202: stderr chunk (state=3): >>><<< 16380 1727204161.39212: stdout chunk (state=3): >>><<< 16380 1727204161.39292: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204161.3511631-18179-144053197417454=/root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204161.39334: variable 'ansible_module_compression' from source: unknown 16380 1727204161.39596: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204161.39625: variable 'ansible_facts' from source: unknown 16380 1727204161.40062: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/AnsiballZ_setup.py 16380 1727204161.40480: Sending initial data 16380 1727204161.40494: Sent initial data (154 bytes) 16380 1727204161.41971: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204161.41998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204161.42084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204161.43865: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204161.43953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204161.43984: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/AnsiballZ_setup.py" <<< 16380 1727204161.44046: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp4mx91tmc /root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/AnsiballZ_setup.py <<< 16380 1727204161.44065: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp4mx91tmc" to remote "/root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/AnsiballZ_setup.py" <<< 16380 1727204161.48255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204161.48400: stderr chunk (state=3): >>><<< 16380 1727204161.48616: stdout chunk (state=3): >>><<< 16380 1727204161.48619: done transferring module to remote 16380 1727204161.48622: _low_level_execute_command(): starting 16380 1727204161.48624: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/ /root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/AnsiballZ_setup.py && sleep 0' 16380 1727204161.49941: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204161.50051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204161.50162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204161.50240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204161.52308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204161.52323: stdout chunk (state=3): >>><<< 16380 1727204161.52510: stderr chunk (state=3): >>><<< 16380 1727204161.52515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204161.52521: _low_level_execute_command(): starting 16380 1727204161.52524: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/AnsiballZ_setup.py && sleep 0' 16380 1727204161.53603: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204161.53607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204161.53668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204161.53745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204161.53828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204161.53832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204161.53882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204161.54069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204162.23686: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb":<<< 16380 1727204162.23731: stdout chunk (state=3): >>> true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", 
"XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2846, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 871, "free": 2846}, "nocache": {"free": 3474, "used": 243}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 665, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 
264108294144, "size_available": 251155976192, "block_size": 4096, "block_total": 64479564, "block_available": 61317377, "block_used": 3162187, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_loadavg": {"1m": 0.43115234375, "5m": 0.52099609375, "15m": 0.33837890625}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "02", "epoch": "1727204162", "epoch_int": "1727204162", "date": "2024-09-24", "time": "14:56:02", "iso8601_micro": "2024-09-24T18:56:02.198695Z", "iso8601": "2024-09-24T18:56:02Z", "iso8601_basic": "20240924T145602198695", "iso8601_basic_short": "20240924T145602", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["LSR-TST-br31", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:07:3c:a4:a3:3f", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204162.25959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204162.26051: stderr chunk (state=3): >>><<< 16380 1727204162.26055: stdout chunk (state=3): >>><<< 16380 1727204162.26092: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, 
"final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2846, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 871, "free": 2846}, "nocache": {"free": 3474, "used": 243}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": 
[], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 665, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251155976192, "block_size": 4096, "block_total": 64479564, "block_available": 61317377, "block_used": 3162187, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_loadavg": {"1m": 0.43115234375, "5m": 0.52099609375, "15m": 0.33837890625}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "02", "epoch": "1727204162", "epoch_int": "1727204162", "date": "2024-09-24", "time": "14:56:02", "iso8601_micro": "2024-09-24T18:56:02.198695Z", "iso8601": "2024-09-24T18:56:02Z", "iso8601_basic": "20240924T145602198695", "iso8601_basic_short": "20240924T145602", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["LSR-TST-br31", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:07:3c:a4:a3:3f", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204162.26411: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204162.26432: _low_level_execute_command(): starting 16380 1727204162.26437: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204161.3511631-18179-144053197417454/ > /dev/null 2>&1 && sleep 0' 16380 1727204162.27185: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204162.27191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204162.27250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204162.27287: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204162.29302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204162.29364: stderr chunk (state=3): >>><<< 16380 1727204162.29367: stdout chunk (state=3): >>><<< 16380 1727204162.29390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204162.29395: handler run complete 16380 1727204162.29540: variable 'ansible_facts' from source: unknown 16380 1727204162.29683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204162.30072: variable 'ansible_facts' from source: unknown 16380 1727204162.30164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204162.30272: attempt loop complete, returning result 16380 1727204162.30275: _execute() done 16380 1727204162.30280: dumping result to json 16380 1727204162.30306: done dumping result, returning 16380 1727204162.30315: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-00000000024e] 16380 1727204162.30322: sending task result for task 12b410aa-8751-749c-b6eb-00000000024e 16380 1727204162.30661: done sending task result for task 12b410aa-8751-749c-b6eb-00000000024e 16380 1727204162.30664: WORKER PROCESS EXITING ok: [managed-node2] 16380 1727204162.31043: no more pending results, returning what we have 16380 1727204162.31045: results queue empty 16380 1727204162.31046: checking for any_errors_fatal 16380 1727204162.31047: done checking for any_errors_fatal 16380 1727204162.31048: checking for max_fail_percentage 16380 1727204162.31049: done checking for max_fail_percentage 16380 1727204162.31050: checking to see if all hosts have failed and the running result is not ok 16380 1727204162.31050: done checking to see if all hosts have failed 16380 1727204162.31051: getting the remaining hosts for this loop 16380 1727204162.31052: done getting the remaining hosts for this loop 16380 1727204162.31055: getting the next task for host managed-node2 16380 1727204162.31059: done getting next task for host managed-node2 16380 1727204162.31061: ^ task is: TASK: meta (flush_handlers) 16380 1727204162.31062: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204162.31065: getting variables 16380 1727204162.31066: in VariableManager get_vars() 16380 1727204162.31085: Calling all_inventory to load vars for managed-node2 16380 1727204162.31087: Calling groups_inventory to load vars for managed-node2 16380 1727204162.31092: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204162.31102: Calling all_plugins_play to load vars for managed-node2 16380 1727204162.31113: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204162.31118: Calling groups_plugins_play to load vars for managed-node2 16380 1727204162.39491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204162.42364: done with get_vars() 16380 1727204162.42413: done getting variables 16380 1727204162.42486: in VariableManager get_vars() 16380 1727204162.42501: Calling all_inventory to load vars for managed-node2 16380 1727204162.42504: Calling groups_inventory to load vars for managed-node2 16380 1727204162.42507: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204162.42516: Calling all_plugins_play to load vars for managed-node2 16380 1727204162.42519: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204162.42523: Calling groups_plugins_play to load vars for managed-node2 16380 1727204162.44505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204162.47631: done with get_vars() 16380 1727204162.47673: done queuing things up, now waiting for results queue to drain 16380 1727204162.47676: results queue empty 16380 1727204162.47677: checking for any_errors_fatal 16380 1727204162.47682: done checking for any_errors_fatal 16380 1727204162.47683: checking for max_fail_percentage 16380 1727204162.47685: done checking for max_fail_percentage 16380 1727204162.47686: checking to see if all hosts have failed and the running result is not ok 16380 1727204162.47687: done checking to see if all hosts have failed 16380 1727204162.47688: getting the remaining hosts for this loop 16380 1727204162.47698: done getting the remaining hosts for this loop 16380 1727204162.47702: getting the next task for host managed-node2 16380 1727204162.47707: done getting next task for host managed-node2 16380 1727204162.47712: ^ task is: TASK: Include the task '{{ task }}' 16380 1727204162.47714: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204162.47717: getting variables 16380 1727204162.47718: in VariableManager get_vars() 16380 1727204162.47730: Calling all_inventory to load vars for managed-node2 16380 1727204162.47733: Calling groups_inventory to load vars for managed-node2 16380 1727204162.47736: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204162.47743: Calling all_plugins_play to load vars for managed-node2 16380 1727204162.47746: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204162.47750: Calling groups_plugins_play to load vars for managed-node2 16380 1727204162.50179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204162.54614: done with get_vars() 16380 1727204162.54662: done getting variables 16380 1727204162.54853: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:56:02 -0400 (0:00:01.255) 0:00:23.655 ***** 16380 1727204162.54885: entering _queue_task() for managed-node2/include_tasks 16380 1727204162.55275: worker is 1 (out of 1 available) 16380 1727204162.55288: exiting _queue_task() for managed-node2/include_tasks 16380 1727204162.55418: done queuing things up, now waiting for results queue to drain 16380 1727204162.55421: waiting for pending results... 16380 1727204162.55718: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_present.yml' 16380 1727204162.55815: in run() - task 12b410aa-8751-749c-b6eb-000000000031 16380 1727204162.55819: variable 'ansible_search_path' from source: unknown 16380 1727204162.55851: calling self._execute() 16380 1727204162.56031: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204162.56035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204162.56038: variable 'omit' from source: magic vars 16380 1727204162.56496: variable 'ansible_distribution_major_version' from source: facts 16380 1727204162.56519: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204162.56531: variable 'task' from source: play vars 16380 1727204162.56625: variable 'task' from source: play vars 16380 1727204162.56640: _execute() done 16380 1727204162.56649: dumping result to json 16380 1727204162.56659: done dumping result, returning 16380 1727204162.56673: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_present.yml' [12b410aa-8751-749c-b6eb-000000000031] 16380 1727204162.56797: sending task result for task 12b410aa-8751-749c-b6eb-000000000031 16380 1727204162.56878: done sending task result for task 12b410aa-8751-749c-b6eb-000000000031 16380 1727204162.56881: WORKER PROCESS EXITING 16380 1727204162.56925: no more pending results, returning what we have 16380 1727204162.56934: in VariableManager get_vars() 16380 1727204162.56974: Calling all_inventory to load vars for managed-node2 16380 1727204162.56978: Calling groups_inventory to load vars for managed-node2 16380 1727204162.56983: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204162.57001: Calling all_plugins_play to load vars for managed-node2 16380 1727204162.57005: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204162.57012: Calling groups_plugins_play to load vars for 
managed-node2 16380 1727204162.59683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204162.62796: done with get_vars() 16380 1727204162.62839: variable 'ansible_search_path' from source: unknown 16380 1727204162.62858: we have included files to process 16380 1727204162.62859: generating all_blocks data 16380 1727204162.62861: done generating all_blocks data 16380 1727204162.62862: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16380 1727204162.62863: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16380 1727204162.62866: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 16380 1727204162.63148: in VariableManager get_vars() 16380 1727204162.63170: done with get_vars() 16380 1727204162.63505: done processing included file 16380 1727204162.63508: iterating over new_blocks loaded from include file 16380 1727204162.63512: in VariableManager get_vars() 16380 1727204162.63527: done with get_vars() 16380 1727204162.63533: filtering new block on tags 16380 1727204162.63561: done filtering new block on tags 16380 1727204162.63565: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 16380 1727204162.63571: extending task lists for all hosts with included blocks 16380 1727204162.63618: done extending task lists 16380 1727204162.63619: done processing included files 16380 1727204162.63620: results queue empty 16380 1727204162.63621: checking for any_errors_fatal 16380 1727204162.63624: done checking for any_errors_fatal 16380 1727204162.63624: checking for max_fail_percentage 16380 1727204162.63626: done checking for max_fail_percentage 16380 1727204162.63627: checking to see if all hosts have failed and the running result is not ok 16380 1727204162.63628: done checking to see if all hosts have failed 16380 1727204162.63629: getting the remaining hosts for this loop 16380 1727204162.63631: done getting the remaining hosts for this loop 16380 1727204162.63634: getting the next task for host managed-node2 16380 1727204162.63643: done getting next task for host managed-node2 16380 1727204162.63647: ^ task is: TASK: Include the task 'get_profile_stat.yml' 16380 1727204162.63649: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204162.63652: getting variables 16380 1727204162.63654: in VariableManager get_vars() 16380 1727204162.63664: Calling all_inventory to load vars for managed-node2 16380 1727204162.63667: Calling groups_inventory to load vars for managed-node2 16380 1727204162.63670: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204162.63677: Calling all_plugins_play to load vars for managed-node2 16380 1727204162.63680: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204162.63683: Calling groups_plugins_play to load vars for managed-node2 16380 1727204162.66923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204162.72073: done with get_vars() 16380 1727204162.72115: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.173) 0:00:23.828 ***** 16380 1727204162.72221: entering _queue_task() for managed-node2/include_tasks 16380 1727204162.72614: worker is 1 (out of 1 available) 16380 1727204162.72626: exiting _queue_task() for managed-node2/include_tasks 16380 1727204162.72639: done queuing things up, now waiting for results queue to drain 16380 1727204162.72641: waiting for pending results... 16380 1727204162.72954: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 16380 1727204162.73295: in run() - task 12b410aa-8751-749c-b6eb-00000000025f 16380 1727204162.73322: variable 'ansible_search_path' from source: unknown 16380 1727204162.73330: variable 'ansible_search_path' from source: unknown 16380 1727204162.73375: calling self._execute() 16380 1727204162.73495: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204162.73519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204162.73617: variable 'omit' from source: magic vars 16380 1727204162.74028: variable 'ansible_distribution_major_version' from source: facts 16380 1727204162.74048: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204162.74062: _execute() done 16380 1727204162.74166: dumping result to json 16380 1727204162.74170: done dumping result, returning 16380 1727204162.74173: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-749c-b6eb-00000000025f] 16380 1727204162.74175: sending task result for task 12b410aa-8751-749c-b6eb-00000000025f 16380 1727204162.74254: done sending task result for task 12b410aa-8751-749c-b6eb-00000000025f 16380 1727204162.74257: WORKER PROCESS EXITING 16380 1727204162.74300: no more pending results, returning what we have 16380 1727204162.74305: in VariableManager get_vars() 16380 1727204162.74350: Calling all_inventory to load vars for managed-node2 16380 1727204162.74354: Calling groups_inventory to load vars for managed-node2 16380 1727204162.74359: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204162.74375: Calling all_plugins_play to load vars for managed-node2 16380 1727204162.74379: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204162.74383: Calling groups_plugins_play to load vars for managed-node2 16380 1727204162.76895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 16380 1727204162.81446: done with get_vars() 16380 1727204162.81492: variable 'ansible_search_path' from source: unknown 16380 1727204162.81494: variable 'ansible_search_path' from source: unknown 16380 1727204162.81507: variable 'task' from source: play vars 16380 1727204162.81639: variable 'task' from source: play vars 16380 1727204162.81683: we have included files to process 16380 1727204162.81685: generating all_blocks data 16380 1727204162.81687: done generating all_blocks data 16380 1727204162.81692: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16380 1727204162.81693: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16380 1727204162.81696: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16380 1727204162.83092: done processing included file 16380 1727204162.83095: iterating over new_blocks loaded from include file 16380 1727204162.83097: in VariableManager get_vars() 16380 1727204162.83118: done with get_vars() 16380 1727204162.83120: filtering new block on tags 16380 1727204162.83153: done filtering new block on tags 16380 1727204162.83157: in VariableManager get_vars() 16380 1727204162.83171: done with get_vars() 16380 1727204162.83173: filtering new block on tags 16380 1727204162.83202: done filtering new block on tags 16380 1727204162.83205: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 16380 1727204162.83217: extending task lists for all hosts with included blocks 16380 1727204162.83449: done extending task lists 16380 1727204162.83451: done processing included files 16380 1727204162.83452: results queue empty 16380 1727204162.83453: checking for any_errors_fatal 16380 1727204162.83457: done checking for any_errors_fatal 16380 1727204162.83458: checking for max_fail_percentage 16380 1727204162.83459: done checking for max_fail_percentage 16380 1727204162.83460: checking to see if all hosts have failed and the running result is not ok 16380 1727204162.83461: done checking to see if all hosts have failed 16380 1727204162.83462: getting the remaining hosts for this loop 16380 1727204162.83464: done getting the remaining hosts for this loop 16380 1727204162.83467: getting the next task for host managed-node2 16380 1727204162.83471: done getting next task for host managed-node2 16380 1727204162.83474: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 16380 1727204162.83477: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204162.83480: getting variables 16380 1727204162.83481: in VariableManager get_vars() 16380 1727204162.83593: Calling all_inventory to load vars for managed-node2 16380 1727204162.83596: Calling groups_inventory to load vars for managed-node2 16380 1727204162.83599: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204162.83606: Calling all_plugins_play to load vars for managed-node2 16380 1727204162.83612: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204162.83616: Calling groups_plugins_play to load vars for managed-node2 16380 1727204162.85854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204162.92094: done with get_vars() 16380 1727204162.92143: done getting variables 16380 1727204162.92203: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.200) 0:00:24.029 ***** 16380 1727204162.92245: entering _queue_task() for managed-node2/set_fact 16380 1727204162.92739: worker is 1 (out of 1 available) 16380 1727204162.92753: exiting _queue_task() for managed-node2/set_fact 16380 1727204162.92766: done queuing things up, now waiting for results queue to drain 16380 1727204162.92768: waiting for pending results... 
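
The set_fact task queued here ("Initialize NM profile exist and ansible_managed comment flag", get_profile_stat.yml:3) runs in the records that follow and, per the result echoed below, initializes three flags to false. A minimal sketch of what that task plausibly looks like, reconstructed only from the fact names and values visible in this log (the actual contents of get_profile_stat.yml are not shown here, so the exact wording is an assumption):

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false            # presumably flipped to true by later tasks if the profile is found
        lsr_net_profile_ansible_managed: false   # assumption: tracks the ansible_managed comment check
        lsr_net_profile_fingerprint: false

Because set_fact is a controller-side action, the records below show only "running the handler" and "handler run complete" with no _low_level_execute_command() or SSH chunk traffic; contrast this with the "Stat profile file" task further down, which builds a remote tmp dir and executes a module on managed-node2 over the multiplexed SSH connection.
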
16380 1727204162.93012: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 16380 1727204162.93296: in run() - task 12b410aa-8751-749c-b6eb-00000000026c 16380 1727204162.93301: variable 'ansible_search_path' from source: unknown 16380 1727204162.93303: variable 'ansible_search_path' from source: unknown 16380 1727204162.93306: calling self._execute() 16380 1727204162.93321: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204162.93334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204162.93351: variable 'omit' from source: magic vars 16380 1727204162.93829: variable 'ansible_distribution_major_version' from source: facts 16380 1727204162.93850: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204162.93870: variable 'omit' from source: magic vars 16380 1727204162.93942: variable 'omit' from source: magic vars 16380 1727204162.94077: variable 'omit' from source: magic vars 16380 1727204162.94081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204162.94108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204162.94139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204162.94167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204162.94192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204162.94234: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204162.94245: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204162.94255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204162.94395: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204162.94417: Set connection var ansible_shell_executable to /bin/sh 16380 1727204162.94432: Set connection var ansible_connection to ssh 16380 1727204162.94445: Set connection var ansible_shell_type to sh 16380 1727204162.94457: Set connection var ansible_pipelining to False 16380 1727204162.94473: Set connection var ansible_timeout to 10 16380 1727204162.94518: variable 'ansible_shell_executable' from source: unknown 16380 1727204162.94595: variable 'ansible_connection' from source: unknown 16380 1727204162.94599: variable 'ansible_module_compression' from source: unknown 16380 1727204162.94602: variable 'ansible_shell_type' from source: unknown 16380 1727204162.94604: variable 'ansible_shell_executable' from source: unknown 16380 1727204162.94607: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204162.94612: variable 'ansible_pipelining' from source: unknown 16380 1727204162.94615: variable 'ansible_timeout' from source: unknown 16380 1727204162.94618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204162.94762: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204162.94781: variable 
'omit' from source: magic vars 16380 1727204162.94795: starting attempt loop 16380 1727204162.94803: running the handler 16380 1727204162.94825: handler run complete 16380 1727204162.94846: attempt loop complete, returning result 16380 1727204162.94855: _execute() done 16380 1727204162.94863: dumping result to json 16380 1727204162.94872: done dumping result, returning 16380 1727204162.94894: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-749c-b6eb-00000000026c] 16380 1727204162.94951: sending task result for task 12b410aa-8751-749c-b6eb-00000000026c 16380 1727204162.95032: done sending task result for task 12b410aa-8751-749c-b6eb-00000000026c 16380 1727204162.95036: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 16380 1727204162.95124: no more pending results, returning what we have 16380 1727204162.95129: results queue empty 16380 1727204162.95130: checking for any_errors_fatal 16380 1727204162.95132: done checking for any_errors_fatal 16380 1727204162.95133: checking for max_fail_percentage 16380 1727204162.95134: done checking for max_fail_percentage 16380 1727204162.95135: checking to see if all hosts have failed and the running result is not ok 16380 1727204162.95136: done checking to see if all hosts have failed 16380 1727204162.95138: getting the remaining hosts for this loop 16380 1727204162.95139: done getting the remaining hosts for this loop 16380 1727204162.95146: getting the next task for host managed-node2 16380 1727204162.95155: done getting next task for host managed-node2 16380 1727204162.95159: ^ task is: TASK: Stat profile file 16380 1727204162.95165: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204162.95170: getting variables 16380 1727204162.95173: in VariableManager get_vars() 16380 1727204162.95212: Calling all_inventory to load vars for managed-node2 16380 1727204162.95217: Calling groups_inventory to load vars for managed-node2 16380 1727204162.95221: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204162.95236: Calling all_plugins_play to load vars for managed-node2 16380 1727204162.95240: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204162.95244: Calling groups_plugins_play to load vars for managed-node2 16380 1727204162.97875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204163.00898: done with get_vars() 16380 1727204163.00947: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:56:03 -0400 (0:00:00.088) 0:00:24.117 ***** 16380 1727204163.01073: entering _queue_task() for managed-node2/stat 16380 1727204163.01469: worker is 1 (out of 1 available) 16380 1727204163.01484: exiting _queue_task() for managed-node2/stat 16380 1727204163.01699: done queuing things up, now waiting for results queue to drain 16380 1727204163.01702: waiting for pending results... 16380 1727204163.01907: running TaskExecutor() for managed-node2/TASK: Stat profile file 16380 1727204163.01936: in run() - task 12b410aa-8751-749c-b6eb-00000000026d 16380 1727204163.01961: variable 'ansible_search_path' from source: unknown 16380 1727204163.01969: variable 'ansible_search_path' from source: unknown 16380 1727204163.02020: calling self._execute() 16380 1727204163.02145: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204163.02160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204163.02176: variable 'omit' from source: magic vars 16380 1727204163.02663: variable 'ansible_distribution_major_version' from source: facts 16380 1727204163.02690: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204163.02798: variable 'omit' from source: magic vars 16380 1727204163.02801: variable 'omit' from source: magic vars 16380 1727204163.02918: variable 'profile' from source: play vars 16380 1727204163.02930: variable 'interface' from source: set_fact 16380 1727204163.03021: variable 'interface' from source: set_fact 16380 1727204163.03049: variable 'omit' from source: magic vars 16380 1727204163.03105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204163.03164: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204163.03198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204163.03237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204163.03257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204163.03300: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204163.03313: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204163.03323: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204163.03456: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204163.03470: Set connection var ansible_shell_executable to /bin/sh 16380 1727204163.03481: Set connection var ansible_connection to ssh 16380 1727204163.03562: Set connection var ansible_shell_type to sh 16380 1727204163.03566: Set connection var ansible_pipelining to False 16380 1727204163.03569: Set connection var ansible_timeout to 10 16380 1727204163.03571: variable 'ansible_shell_executable' from source: unknown 16380 1727204163.03573: variable 'ansible_connection' from source: unknown 16380 1727204163.03576: variable 'ansible_module_compression' from source: unknown 16380 1727204163.03578: variable 'ansible_shell_type' from source: unknown 16380 1727204163.03580: variable 'ansible_shell_executable' from source: unknown 16380 1727204163.03584: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204163.03598: variable 'ansible_pipelining' from source: unknown 16380 1727204163.03607: variable 'ansible_timeout' from source: unknown 16380 1727204163.03619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204163.04197: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204163.04201: variable 'omit' from source: magic vars 16380 1727204163.04203: starting attempt loop 16380 1727204163.04206: running the handler 16380 1727204163.04208: _low_level_execute_command(): starting 16380 1727204163.04213: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204163.05616: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204163.05713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204163.05813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204163.05887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.05962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.07933: stdout chunk (state=3): >>>/root <<< 16380 1727204163.08145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204163.08155: stdout chunk (state=3): >>><<< 16380 1727204163.08163: stderr chunk (state=3): >>><<< 16380 1727204163.08197: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204163.08272: _low_level_execute_command(): starting 16380 1727204163.08276: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992 `" && echo ansible-tmp-1727204163.0819213-18220-20885018960992="` echo /root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992 `" ) && sleep 0' 16380 1727204163.09543: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204163.09772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204163.21686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.21694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.21698: stdout chunk (state=3): >>>ansible-tmp-1727204163.0819213-18220-20885018960992=/root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992 <<< 16380 1727204163.21700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204163.21702: stderr chunk (state=3): >>><<< 16380 1727204163.21705: stdout chunk (state=3): >>><<< 16380 1727204163.21707: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204163.0819213-18220-20885018960992=/root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204163.21709: variable 'ansible_module_compression' from source: unknown 16380 1727204163.21711: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16380 1727204163.21713: variable 'ansible_facts' from source: unknown 16380 1727204163.21715: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/AnsiballZ_stat.py 16380 1727204163.22116: Sending initial data 16380 1727204163.22120: Sent initial data (152 bytes) 16380 1727204163.23137: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.23174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.24943: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204163.25014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204163.25080: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpvydp8cx0 /root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/AnsiballZ_stat.py <<< 16380 1727204163.25084: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/AnsiballZ_stat.py" <<< 16380 1727204163.25147: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpvydp8cx0" to remote "/root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/AnsiballZ_stat.py" <<< 16380 1727204163.26880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204163.27062: stderr chunk (state=3): >>><<< 16380 1727204163.27066: stdout chunk (state=3): >>><<< 16380 1727204163.27070: done transferring module to remote 16380 1727204163.27072: _low_level_execute_command(): starting 16380 1727204163.27088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/ /root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/AnsiballZ_stat.py && sleep 0' 16380 1727204163.28313: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204163.28415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204163.28419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204163.28632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.28719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.30723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204163.30801: stderr chunk (state=3): >>><<< 16380 1727204163.30805: stdout chunk (state=3): >>><<< 16380 1727204163.30828: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204163.30949: _low_level_execute_command(): starting 16380 1727204163.30953: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/AnsiballZ_stat.py && sleep 0' 16380 1727204163.31553: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204163.31569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204163.31585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204163.31616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204163.31649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204163.31760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204163.31782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204163.31798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.31894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.49801: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16380 1727204163.51339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204163.51361: stdout chunk (state=3): >>><<< 16380 1727204163.51388: stderr chunk (state=3): >>><<< 16380 1727204163.51575: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204163.51579: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204163.51581: _low_level_execute_command(): starting 16380 1727204163.51584: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204163.0819213-18220-20885018960992/ > /dev/null 2>&1 && sleep 0' 16380 1727204163.52205: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204163.52254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204163.52274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204163.52278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.52351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.54525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204163.54749: stderr chunk (state=3): >>><<< 16380 1727204163.54756: stdout chunk (state=3): >>><<< 16380 1727204163.54924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204163.54928: handler run complete 16380 1727204163.54931: attempt loop complete, returning result 16380 1727204163.54934: _execute() done 16380 1727204163.54936: dumping result to json 16380 1727204163.54938: done dumping result, returning 16380 1727204163.54941: done running TaskExecutor() for managed-node2/TASK: Stat profile file [12b410aa-8751-749c-b6eb-00000000026d] 16380 1727204163.54947: sending task result for task 12b410aa-8751-749c-b6eb-00000000026d ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 16380 1727204163.55164: no more pending results, returning what we have 16380 1727204163.55170: results queue empty 16380 1727204163.55171: checking for any_errors_fatal 16380 1727204163.55181: done checking for any_errors_fatal 16380 1727204163.55182: checking for max_fail_percentage 16380 1727204163.55183: done checking for max_fail_percentage 16380 1727204163.55184: checking to see if all hosts have failed and the running result is not ok 16380 1727204163.55185: done checking to see if all hosts have failed 16380 1727204163.55186: getting the remaining hosts for this loop 16380 1727204163.55192: done getting the remaining hosts for this loop 16380 1727204163.55197: getting the next task for host managed-node2 16380 1727204163.55207: done getting next task for host managed-node2 16380 1727204163.55212: ^ task is: TASK: Set NM profile exist flag based on the profile files 16380 1727204163.55217: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204163.55222: getting variables 16380 1727204163.55224: in VariableManager get_vars() 16380 1727204163.55260: Calling all_inventory to load vars for managed-node2 16380 1727204163.55263: Calling groups_inventory to load vars for managed-node2 16380 1727204163.55268: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204163.55283: Calling all_plugins_play to load vars for managed-node2 16380 1727204163.55287: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204163.55546: done sending task result for task 12b410aa-8751-749c-b6eb-00000000026d 16380 1727204163.55549: WORKER PROCESS EXITING 16380 1727204163.55555: Calling groups_plugins_play to load vars for managed-node2 16380 1727204163.60229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204163.64897: done with get_vars() 16380 1727204163.64945: done getting variables 16380 1727204163.65054: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:56:03 -0400 (0:00:00.640) 0:00:24.757 ***** 16380 1727204163.65094: entering _queue_task() for managed-node2/set_fact 16380 1727204163.65696: worker is 1 (out of 1 available) 16380 1727204163.65711: exiting _queue_task() for managed-node2/set_fact 16380 1727204163.65724: done queuing things up, now waiting for results queue to drain 16380 1727204163.65726: waiting for pending results... 
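The records above trace one complete remote module execution for the "Stat profile file" task: _low_level_execute_command() first resolves the remote home directory with 'echo ~', creates a per-task temporary directory under /root/.ansible/tmp, SFTPs the packed AnsiballZ_stat.py module across, marks it executable, runs it with /usr/bin/python3.12, and finally removes the temporary directory. The module reported {"exists": false} for /etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31, i.e. no initscripts-style profile file is present. A minimal sketch of a task that would produce exactly these module_args follows; the real task lives in get_profile_stat.yml and presumably templates the path from a profile variable, so the literal path and the register name here are inferred from this log, not copied from that file:

    - name: Stat profile file
      ansible.builtin.stat:
        path: /etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31  # inferred from module_args above
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat  # evaluated below as profile_stat.stat.exists
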
16380 1727204163.66089: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 16380 1727204163.66266: in run() - task 12b410aa-8751-749c-b6eb-00000000026e 16380 1727204163.66298: variable 'ansible_search_path' from source: unknown 16380 1727204163.66305: variable 'ansible_search_path' from source: unknown 16380 1727204163.66356: calling self._execute() 16380 1727204163.66485: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204163.66502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204163.66525: variable 'omit' from source: magic vars 16380 1727204163.67298: variable 'ansible_distribution_major_version' from source: facts 16380 1727204163.67351: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204163.67536: variable 'profile_stat' from source: set_fact 16380 1727204163.67558: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204163.67566: when evaluation is False, skipping this task 16380 1727204163.67596: _execute() done 16380 1727204163.67600: dumping result to json 16380 1727204163.67607: done dumping result, returning 16380 1727204163.67610: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-749c-b6eb-00000000026e] 16380 1727204163.67618: sending task result for task 12b410aa-8751-749c-b6eb-00000000026e 16380 1727204163.67799: done sending task result for task 12b410aa-8751-749c-b6eb-00000000026e 16380 1727204163.67803: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204163.68036: no more pending results, returning what we have 16380 1727204163.68042: results queue empty 16380 1727204163.68043: checking for any_errors_fatal 16380 1727204163.68055: done checking for any_errors_fatal 16380 1727204163.68056: checking for max_fail_percentage 16380 1727204163.68058: done checking for max_fail_percentage 16380 1727204163.68059: checking to see if all hosts have failed and the running result is not ok 16380 1727204163.68060: done checking to see if all hosts have failed 16380 1727204163.68064: getting the remaining hosts for this loop 16380 1727204163.68066: done getting the remaining hosts for this loop 16380 1727204163.68071: getting the next task for host managed-node2 16380 1727204163.68083: done getting next task for host managed-node2 16380 1727204163.68087: ^ task is: TASK: Get NM profile info 16380 1727204163.68096: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204163.68104: getting variables 16380 1727204163.68107: in VariableManager get_vars() 16380 1727204163.68149: Calling all_inventory to load vars for managed-node2 16380 1727204163.68247: Calling groups_inventory to load vars for managed-node2 16380 1727204163.68262: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204163.68276: Calling all_plugins_play to load vars for managed-node2 16380 1727204163.68279: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204163.68283: Calling groups_plugins_play to load vars for managed-node2 16380 1727204163.71214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204163.74556: done with get_vars() 16380 1727204163.74637: done getting variables 16380 1727204163.74995: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:56:03 -0400 (0:00:00.099) 0:00:24.857 ***** 16380 1727204163.75104: entering _queue_task() for managed-node2/shell 16380 1727204163.75106: Creating lock for shell 16380 1727204163.75717: worker is 1 (out of 1 available) 16380 1727204163.75731: exiting _queue_task() for managed-node2/shell 16380 1727204163.75743: done queuing things up, now waiting for results queue to drain 16380 1727204163.75745: waiting for pending results... 
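The "Set NM profile exist flag based on the profile files" task above was skipped: its first guard, ansible_distribution_major_version != '6', evaluated True, but profile_stat.stat.exists evaluated False because the ifcfg file does not exist, so the set_fact handler never ran. A hedged sketch of that pattern, with the fact name invented purely for illustration since the actual body at get_profile_stat.yml:17 is not reproduced in this log:

    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        profile_exists: true  # assumed fact name, for illustration only
      when:
        - ansible_distribution_major_version != '6'
        - profile_stat.stat.exists
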
16380 1727204163.76000: running TaskExecutor() for managed-node2/TASK: Get NM profile info 16380 1727204163.76323: in run() - task 12b410aa-8751-749c-b6eb-00000000026f 16380 1727204163.76382: variable 'ansible_search_path' from source: unknown 16380 1727204163.76385: variable 'ansible_search_path' from source: unknown 16380 1727204163.76447: calling self._execute() 16380 1727204163.76750: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204163.76754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204163.76757: variable 'omit' from source: magic vars 16380 1727204163.77597: variable 'ansible_distribution_major_version' from source: facts 16380 1727204163.78080: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204163.78084: variable 'omit' from source: magic vars 16380 1727204163.78087: variable 'omit' from source: magic vars 16380 1727204163.78384: variable 'profile' from source: play vars 16380 1727204163.78388: variable 'interface' from source: set_fact 16380 1727204163.78482: variable 'interface' from source: set_fact 16380 1727204163.78515: variable 'omit' from source: magic vars 16380 1727204163.78568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204163.78632: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204163.78653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204163.78677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204163.78692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204163.78753: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204163.78757: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204163.78764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204163.78911: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204163.78921: Set connection var ansible_shell_executable to /bin/sh 16380 1727204163.78931: Set connection var ansible_connection to ssh 16380 1727204163.78939: Set connection var ansible_shell_type to sh 16380 1727204163.78944: Set connection var ansible_pipelining to False 16380 1727204163.78954: Set connection var ansible_timeout to 10 16380 1727204163.78975: variable 'ansible_shell_executable' from source: unknown 16380 1727204163.78979: variable 'ansible_connection' from source: unknown 16380 1727204163.78981: variable 'ansible_module_compression' from source: unknown 16380 1727204163.78986: variable 'ansible_shell_type' from source: unknown 16380 1727204163.78990: variable 'ansible_shell_executable' from source: unknown 16380 1727204163.78995: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204163.79000: variable 'ansible_pipelining' from source: unknown 16380 1727204163.79003: variable 'ansible_timeout' from source: unknown 16380 1727204163.79008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204163.79138: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204163.79152: variable 'omit' from source: magic vars 16380 1727204163.79160: starting attempt loop 16380 1727204163.79163: running the handler 16380 1727204163.79174: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204163.79193: _low_level_execute_command(): starting 16380 1727204163.79201: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204163.79751: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204163.79755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204163.79759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204163.79764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204163.79820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204163.79826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204163.79829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.79871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.81797: stdout chunk (state=3): >>>/root <<< 16380 1727204163.82034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204163.82048: stdout chunk (state=3): >>><<< 16380 1727204163.82574: stderr chunk (state=3): >>><<< 16380 1727204163.82578: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204163.82581: _low_level_execute_command(): starting 16380 1727204163.82583: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960 `" && echo ansible-tmp-1727204163.8237479-18238-159826242810960="` echo /root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960 `" ) && sleep 0' 16380 1727204163.84122: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204163.84394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204163.84414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.84521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.86818: stdout chunk (state=3): >>>ansible-tmp-1727204163.8237479-18238-159826242810960=/root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960 <<< 16380 1727204163.87103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204163.87136: stderr chunk (state=3): >>><<< 16380 1727204163.87141: stdout chunk (state=3): >>><<< 16380 1727204163.87295: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204163.8237479-18238-159826242810960=/root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204163.87299: variable 'ansible_module_compression' from source: unknown 16380 1727204163.87302: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16380 1727204163.87399: variable 'ansible_facts' from source: unknown 16380 1727204163.87701: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/AnsiballZ_command.py 16380 1727204163.87946: Sending initial data 16380 1727204163.87956: Sent initial data (156 bytes) 16380 1727204163.88461: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204163.88476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204163.88496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204163.88608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204163.88622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.88702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.90485: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204163.90597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204163.90627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/AnsiballZ_command.py" <<< 16380 1727204163.90649: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpxadnkko9 /root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/AnsiballZ_command.py <<< 16380 1727204163.90675: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 16380 1727204163.90709: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpxadnkko9" to remote "/root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/AnsiballZ_command.py" <<< 16380 1727204163.92768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204163.92773: stdout chunk (state=3): >>><<< 16380 1727204163.92775: stderr chunk (state=3): >>><<< 16380 1727204163.92782: done transferring module to remote 16380 1727204163.92785: _low_level_execute_command(): starting 16380 1727204163.92796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/ /root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/AnsiballZ_command.py && sleep 0' 16380 1727204163.93924: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204163.93940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204163.94003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204163.94155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204163.94230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.94325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204163.96404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204163.96420: stdout chunk (state=3): >>><<< 16380 1727204163.96432: stderr chunk (state=3): >>><<< 16380 1727204163.96455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204163.96464: _low_level_execute_command(): starting 16380 1727204163.96474: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/AnsiballZ_command.py && sleep 0' 16380 1727204163.97122: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204163.97126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204163.97139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204163.97231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204163.97237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204163.97240: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204163.97242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204163.97245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16380 1727204163.97247: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204163.97249: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16380 1727204163.97251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204163.97253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204163.97313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204163.97349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204163.97353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204163.97376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204163.97460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204164.17397: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:04.151736", "end": "2024-09-24 14:56:04.171293", "delta": "0:00:00.019557", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep 
/etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16380 1727204164.19011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 16380 1727204164.19029: stderr chunk (state=3): >>><<< 16380 1727204164.19038: stdout chunk (state=3): >>><<< 16380 1727204164.19061: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:04.151736", "end": "2024-09-24 14:56:04.171293", "delta": "0:00:00.019557", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
16380 1727204164.19119: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204164.19134: _low_level_execute_command(): starting 16380 1727204164.19145: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204163.8237479-18238-159826242810960/ > /dev/null 2>&1 && sleep 0' 16380 1727204164.19782: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204164.19804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204164.19828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204164.19850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204164.19870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204164.19909: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204164.19923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204164.19937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204164.20024: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204164.20081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204164.20126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204164.22227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204164.22231: stdout chunk (state=3): >>><<< 16380 1727204164.22239: stderr chunk (state=3): >>><<< 16380 1727204164.22259: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204164.22268: handler run complete 16380 1727204164.22303: Evaluated conditional (False): False 16380 1727204164.22319: attempt loop complete, returning result 16380 1727204164.22322: _execute() done 16380 1727204164.22325: dumping result to json 16380 1727204164.22497: done dumping result, returning 16380 1727204164.22500: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [12b410aa-8751-749c-b6eb-00000000026f] 16380 1727204164.22503: sending task result for task 12b410aa-8751-749c-b6eb-00000000026f 16380 1727204164.22576: done sending task result for task 12b410aa-8751-749c-b6eb-00000000026f 16380 1727204164.22580: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.019557", "end": "2024-09-24 14:56:04.171293", "rc": 0, "start": "2024-09-24 14:56:04.151736" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 16380 1727204164.22692: no more pending results, returning what we have 16380 1727204164.22697: results queue empty 16380 1727204164.22698: checking for any_errors_fatal 16380 1727204164.22705: done checking for any_errors_fatal 16380 1727204164.22706: checking for max_fail_percentage 16380 1727204164.22708: done checking for max_fail_percentage 16380 1727204164.22712: checking to see if all hosts have failed and the running result is not ok 16380 1727204164.22713: done checking to see if all hosts have failed 16380 1727204164.22714: getting the remaining hosts for this loop 16380 1727204164.22716: done getting the remaining hosts for this loop 16380 1727204164.22839: getting the next task for host managed-node2 16380 1727204164.22848: done getting next task for host managed-node2 16380 1727204164.22852: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16380 1727204164.22857: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204164.22861: getting variables 16380 1727204164.22862: in VariableManager get_vars() 16380 1727204164.22895: Calling all_inventory to load vars for managed-node2 16380 1727204164.22898: Calling groups_inventory to load vars for managed-node2 16380 1727204164.22903: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204164.22918: Calling all_plugins_play to load vars for managed-node2 16380 1727204164.22922: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204164.22926: Calling groups_plugins_play to load vars for managed-node2 16380 1727204164.25601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204164.30766: done with get_vars() 16380 1727204164.30996: done getting variables 16380 1727204164.31075: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.561) 0:00:25.418 ***** 16380 1727204164.31223: entering _queue_task() for managed-node2/set_fact 16380 1727204164.32064: worker is 1 (out of 1 available) 16380 1727204164.32079: exiting _queue_task() for managed-node2/set_fact 16380 1727204164.32107: done queuing things up, now waiting for results queue to drain 16380 1727204164.32113: waiting for pending results... 
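For reference, the ok: result above was produced by a shell task (the module dump shows ansible.legacy.command with _uses_shell: True). A minimal sketch of that task, reconstructed from the logged command and from the nm_profile_exists register that the next task consumes; the exact task text is not part of this log, LSR-TST-br31 is generalized to {{ profile }} to match the templated task names elsewhere in the log, and ignore_errors is an assumption so that a missing profile would not abort the check:

    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists  # name inferred from the conditional evaluated below
      ignore_errors: true          # assumption: allows inspecting rc instead of failing

Here rc=0 and stdout names the keyfile /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection, which is what the following set_fact task keys on.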
16380 1727204164.32432: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16380 1727204164.32567: in run() - task 12b410aa-8751-749c-b6eb-000000000270 16380 1727204164.32588: variable 'ansible_search_path' from source: unknown 16380 1727204164.32593: variable 'ansible_search_path' from source: unknown 16380 1727204164.32635: calling self._execute() 16380 1727204164.32748: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204164.32757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204164.32768: variable 'omit' from source: magic vars 16380 1727204164.33263: variable 'ansible_distribution_major_version' from source: facts 16380 1727204164.33277: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204164.33467: variable 'nm_profile_exists' from source: set_fact 16380 1727204164.33486: Evaluated conditional (nm_profile_exists.rc == 0): True 16380 1727204164.33494: variable 'omit' from source: magic vars 16380 1727204164.33567: variable 'omit' from source: magic vars 16380 1727204164.33611: variable 'omit' from source: magic vars 16380 1727204164.33670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204164.33762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204164.33766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204164.33770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204164.33786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204164.33825: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204164.33829: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204164.33834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204164.34061: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204164.34064: Set connection var ansible_shell_executable to /bin/sh 16380 1727204164.34067: Set connection var ansible_connection to ssh 16380 1727204164.34070: Set connection var ansible_shell_type to sh 16380 1727204164.34072: Set connection var ansible_pipelining to False 16380 1727204164.34075: Set connection var ansible_timeout to 10 16380 1727204164.34077: variable 'ansible_shell_executable' from source: unknown 16380 1727204164.34080: variable 'ansible_connection' from source: unknown 16380 1727204164.34099: variable 'ansible_module_compression' from source: unknown 16380 1727204164.34102: variable 'ansible_shell_type' from source: unknown 16380 1727204164.34106: variable 'ansible_shell_executable' from source: unknown 16380 1727204164.34109: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204164.34111: variable 'ansible_pipelining' from source: unknown 16380 1727204164.34114: variable 'ansible_timeout' from source: unknown 16380 1727204164.34117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204164.34417: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204164.34421: variable 'omit' from source: magic vars 16380 1727204164.34423: starting attempt loop 16380 1727204164.34426: running the handler 16380 1727204164.34432: handler run complete 16380 1727204164.34463: attempt loop complete, returning result 16380 1727204164.34492: _execute() done 16380 1727204164.34495: dumping result to json 16380 1727204164.34524: done dumping result, returning 16380 1727204164.34528: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-749c-b6eb-000000000270] 16380 1727204164.34530: sending task result for task 12b410aa-8751-749c-b6eb-000000000270 16380 1727204164.35055: done sending task result for task 12b410aa-8751-749c-b6eb-000000000270 16380 1727204164.35058: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 16380 1727204164.35143: no more pending results, returning what we have 16380 1727204164.35146: results queue empty 16380 1727204164.35147: checking for any_errors_fatal 16380 1727204164.35154: done checking for any_errors_fatal 16380 1727204164.35155: checking for max_fail_percentage 16380 1727204164.35157: done checking for max_fail_percentage 16380 1727204164.35158: checking to see if all hosts have failed and the running result is not ok 16380 1727204164.35159: done checking to see if all hosts have failed 16380 1727204164.35160: getting the remaining hosts for this loop 16380 1727204164.35162: done getting the remaining hosts for this loop 16380 1727204164.35166: getting the next task for host managed-node2 16380 1727204164.35177: done getting next task for host managed-node2 16380 1727204164.35180: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 16380 1727204164.35187: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204164.35190: getting variables 16380 1727204164.35192: in VariableManager get_vars() 16380 1727204164.35228: Calling all_inventory to load vars for managed-node2 16380 1727204164.35232: Calling groups_inventory to load vars for managed-node2 16380 1727204164.35236: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204164.35248: Calling all_plugins_play to load vars for managed-node2 16380 1727204164.35251: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204164.35255: Calling groups_plugins_play to load vars for managed-node2 16380 1727204164.38444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204164.41742: done with get_vars() 16380 1727204164.41791: done getting variables 16380 1727204164.41858: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204164.42014: variable 'profile' from source: play vars 16380 1727204164.42018: variable 'interface' from source: set_fact 16380 1727204164.42089: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.109) 0:00:25.527 ***** 16380 1727204164.42139: entering _queue_task() for managed-node2/command 16380 1727204164.43025: worker is 1 (out of 1 available) 16380 1727204164.43037: exiting _queue_task() for managed-node2/command 16380 1727204164.43050: done queuing things up, now waiting for results queue to drain 16380 1727204164.43052: waiting for pending results... 
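The "Set NM profile exist flag and ansible_managed flag true based on the nmcli output" task above is almost fully determined by its trace: the log names the module (set_fact), the gating conditional (nm_profile_exists.rc == 0), and the three facts that end up in the ok: result. A sketch of the task at get_profile_stat.yml:35 consistent with all of that (the real file may differ in layout):

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0  # rc was 0 above, so all three facts are set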
16380 1727204164.43609: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 16380 1727204164.43779: in run() - task 12b410aa-8751-749c-b6eb-000000000272 16380 1727204164.44095: variable 'ansible_search_path' from source: unknown 16380 1727204164.44099: variable 'ansible_search_path' from source: unknown 16380 1727204164.44102: calling self._execute() 16380 1727204164.44416: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204164.44430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204164.44446: variable 'omit' from source: magic vars 16380 1727204164.45274: variable 'ansible_distribution_major_version' from source: facts 16380 1727204164.45294: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204164.45648: variable 'profile_stat' from source: set_fact 16380 1727204164.45669: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204164.45680: when evaluation is False, skipping this task 16380 1727204164.45687: _execute() done 16380 1727204164.45698: dumping result to json 16380 1727204164.45706: done dumping result, returning 16380 1727204164.45717: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000272] 16380 1727204164.45727: sending task result for task 12b410aa-8751-749c-b6eb-000000000272 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204164.45895: no more pending results, returning what we have 16380 1727204164.45900: results queue empty 16380 1727204164.45902: checking for any_errors_fatal 16380 1727204164.45913: done checking for any_errors_fatal 16380 1727204164.45914: checking for max_fail_percentage 16380 1727204164.45916: done checking for max_fail_percentage 16380 1727204164.45917: checking to see if all hosts have failed and the running result is not ok 16380 1727204164.45918: done checking to see if all hosts have failed 16380 1727204164.45920: getting the remaining hosts for this loop 16380 1727204164.45921: done getting the remaining hosts for this loop 16380 1727204164.45927: getting the next task for host managed-node2 16380 1727204164.45937: done getting next task for host managed-node2 16380 1727204164.45941: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 16380 1727204164.45948: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204164.45952: getting variables 16380 1727204164.45954: in VariableManager get_vars() 16380 1727204164.45988: Calling all_inventory to load vars for managed-node2 16380 1727204164.46119: Calling groups_inventory to load vars for managed-node2 16380 1727204164.46124: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204164.46133: done sending task result for task 12b410aa-8751-749c-b6eb-000000000272 16380 1727204164.46137: WORKER PROCESS EXITING 16380 1727204164.46152: Calling all_plugins_play to load vars for managed-node2 16380 1727204164.46155: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204164.46159: Calling groups_plugins_play to load vars for managed-node2 16380 1727204164.51108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204164.57405: done with get_vars() 16380 1727204164.57568: done getting variables 16380 1727204164.57644: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204164.58015: variable 'profile' from source: play vars 16380 1727204164.58019: variable 'interface' from source: set_fact 16380 1727204164.58199: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.161) 0:00:25.689 ***** 16380 1727204164.58244: entering _queue_task() for managed-node2/set_fact 16380 1727204164.59061: worker is 1 (out of 1 available) 16380 1727204164.59190: exiting _queue_task() for managed-node2/set_fact 16380 1727204164.59204: done queuing things up, now waiting for results queue to drain 16380 1727204164.59206: waiting for pending results... 
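The skip just logged, and the three like it that follow (Verify the ansible_managed comment, Get the fingerprint comment, Verify the fingerprint comment), all report the same false_condition: profile_stat.stat.exists. That is expected on this run: the nmcli output earlier shows the profile stored as a NetworkManager keyfile under /etc/NetworkManager/system-connections, so there is no initscripts ifcfg file for profile_stat (registered by an earlier stat task outside this excerpt) to find. The gate presumably looks like the sketch below; the grep body and the ifcfg path are illustrative assumptions, since only the task name and the when: outcome appear in the log:

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # hypothetical body
      when: profile_stat.stat.exists  # False here, so the task is skipped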
16380 1727204164.59582: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 16380 1727204164.59825: in run() - task 12b410aa-8751-749c-b6eb-000000000273 16380 1727204164.59852: variable 'ansible_search_path' from source: unknown 16380 1727204164.59995: variable 'ansible_search_path' from source: unknown 16380 1727204164.59999: calling self._execute() 16380 1727204164.60028: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204164.60042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204164.60060: variable 'omit' from source: magic vars 16380 1727204164.60554: variable 'ansible_distribution_major_version' from source: facts 16380 1727204164.60573: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204164.60733: variable 'profile_stat' from source: set_fact 16380 1727204164.60756: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204164.60768: when evaluation is False, skipping this task 16380 1727204164.60776: _execute() done 16380 1727204164.60784: dumping result to json 16380 1727204164.60794: done dumping result, returning 16380 1727204164.60807: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000273] 16380 1727204164.60818: sending task result for task 12b410aa-8751-749c-b6eb-000000000273 16380 1727204164.61045: done sending task result for task 12b410aa-8751-749c-b6eb-000000000273 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204164.61101: no more pending results, returning what we have 16380 1727204164.61105: results queue empty 16380 1727204164.61106: checking for any_errors_fatal 16380 1727204164.61118: done checking for any_errors_fatal 16380 1727204164.61119: checking for max_fail_percentage 16380 1727204164.61120: done checking for max_fail_percentage 16380 1727204164.61121: checking to see if all hosts have failed and the running result is not ok 16380 1727204164.61122: done checking to see if all hosts have failed 16380 1727204164.61123: getting the remaining hosts for this loop 16380 1727204164.61125: done getting the remaining hosts for this loop 16380 1727204164.61129: getting the next task for host managed-node2 16380 1727204164.61137: done getting next task for host managed-node2 16380 1727204164.61140: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 16380 1727204164.61146: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204164.61149: getting variables 16380 1727204164.61151: in VariableManager get_vars() 16380 1727204164.61180: Calling all_inventory to load vars for managed-node2 16380 1727204164.61184: Calling groups_inventory to load vars for managed-node2 16380 1727204164.61187: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204164.61202: Calling all_plugins_play to load vars for managed-node2 16380 1727204164.61205: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204164.61211: Calling groups_plugins_play to load vars for managed-node2 16380 1727204164.61730: WORKER PROCESS EXITING 16380 1727204164.66139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204164.72637: done with get_vars() 16380 1727204164.72802: done getting variables 16380 1727204164.72885: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204164.73253: variable 'profile' from source: play vars 16380 1727204164.73258: variable 'interface' from source: set_fact 16380 1727204164.73388: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.151) 0:00:25.840 ***** 16380 1727204164.73431: entering _queue_task() for managed-node2/command 16380 1727204164.74256: worker is 1 (out of 1 available) 16380 1727204164.74271: exiting _queue_task() for managed-node2/command 16380 1727204164.74285: done queuing things up, now waiting for results queue to drain 16380 1727204164.74287: waiting for pending results... 
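Note also that every task trace in this excerpt first logs "Evaluated conditional (ansible_distribution_major_version != '6'): True": the test files gate all work on the distribution major version, though the log does not show whether that when: sits on each task or is inherited from an enclosing block. A self-contained illustration of the combined gate, using a hypothetical debug task (entries in a when: list are ANDed, so the first True result still leaves the second entry to decide the skip):

    - name: Example task gated the same way  # hypothetical, for illustration only
      ansible.builtin.debug:
        msg: runs only when both conditions hold
      when:
        - ansible_distribution_major_version != '6'  # True on this node
        - profile_stat.stat.exists                   # False above, so such tasks skip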
16380 1727204164.74907: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 16380 1727204164.75296: in run() - task 12b410aa-8751-749c-b6eb-000000000274 16380 1727204164.75300: variable 'ansible_search_path' from source: unknown 16380 1727204164.75304: variable 'ansible_search_path' from source: unknown 16380 1727204164.75306: calling self._execute() 16380 1727204164.75309: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204164.75311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204164.75314: variable 'omit' from source: magic vars 16380 1727204164.76323: variable 'ansible_distribution_major_version' from source: facts 16380 1727204164.76343: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204164.76704: variable 'profile_stat' from source: set_fact 16380 1727204164.76730: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204164.76737: when evaluation is False, skipping this task 16380 1727204164.76747: _execute() done 16380 1727204164.76755: dumping result to json 16380 1727204164.76763: done dumping result, returning 16380 1727204164.76774: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000274] 16380 1727204164.76783: sending task result for task 12b410aa-8751-749c-b6eb-000000000274 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204164.77227: no more pending results, returning what we have 16380 1727204164.77231: results queue empty 16380 1727204164.77232: checking for any_errors_fatal 16380 1727204164.77238: done checking for any_errors_fatal 16380 1727204164.77239: checking for max_fail_percentage 16380 1727204164.77241: done checking for max_fail_percentage 16380 1727204164.77242: checking to see if all hosts have failed and the running result is not ok 16380 1727204164.77243: done checking to see if all hosts have failed 16380 1727204164.77244: getting the remaining hosts for this loop 16380 1727204164.77245: done getting the remaining hosts for this loop 16380 1727204164.77250: getting the next task for host managed-node2 16380 1727204164.77259: done getting next task for host managed-node2 16380 1727204164.77262: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 16380 1727204164.77379: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204164.77384: getting variables 16380 1727204164.77385: in VariableManager get_vars() 16380 1727204164.77419: Calling all_inventory to load vars for managed-node2 16380 1727204164.77423: Calling groups_inventory to load vars for managed-node2 16380 1727204164.77427: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204164.77441: Calling all_plugins_play to load vars for managed-node2 16380 1727204164.77444: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204164.77449: Calling groups_plugins_play to load vars for managed-node2 16380 1727204164.78047: done sending task result for task 12b410aa-8751-749c-b6eb-000000000274 16380 1727204164.78590: WORKER PROCESS EXITING 16380 1727204164.81683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204164.85337: done with get_vars() 16380 1727204164.85381: done getting variables 16380 1727204164.85461: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204164.85600: variable 'profile' from source: play vars 16380 1727204164.85605: variable 'interface' from source: set_fact 16380 1727204164.85680: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.122) 0:00:25.963 ***** 16380 1727204164.85722: entering _queue_task() for managed-node2/set_fact 16380 1727204164.86312: worker is 1 (out of 1 available) 16380 1727204164.86323: exiting _queue_task() for managed-node2/set_fact 16380 1727204164.86333: done queuing things up, now waiting for results queue to drain 16380 1727204164.86336: waiting for pending results... 
16380 1727204164.86449: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 16380 1727204164.86596: in run() - task 12b410aa-8751-749c-b6eb-000000000275 16380 1727204164.86618: variable 'ansible_search_path' from source: unknown 16380 1727204164.86622: variable 'ansible_search_path' from source: unknown 16380 1727204164.86664: calling self._execute() 16380 1727204164.86770: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204164.86784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204164.86802: variable 'omit' from source: magic vars 16380 1727204164.87386: variable 'ansible_distribution_major_version' from source: facts 16380 1727204164.87442: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204164.87819: variable 'profile_stat' from source: set_fact 16380 1727204164.87998: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204164.88002: when evaluation is False, skipping this task 16380 1727204164.88005: _execute() done 16380 1727204164.88007: dumping result to json 16380 1727204164.88010: done dumping result, returning 16380 1727204164.88012: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000275] 16380 1727204164.88015: sending task result for task 12b410aa-8751-749c-b6eb-000000000275 16380 1727204164.88364: done sending task result for task 12b410aa-8751-749c-b6eb-000000000275 16380 1727204164.88368: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204164.88428: no more pending results, returning what we have 16380 1727204164.88432: results queue empty 16380 1727204164.88434: checking for any_errors_fatal 16380 1727204164.88440: done checking for any_errors_fatal 16380 1727204164.88441: checking for max_fail_percentage 16380 1727204164.88443: done checking for max_fail_percentage 16380 1727204164.88444: checking to see if all hosts have failed and the running result is not ok 16380 1727204164.88445: done checking to see if all hosts have failed 16380 1727204164.88446: getting the remaining hosts for this loop 16380 1727204164.88448: done getting the remaining hosts for this loop 16380 1727204164.88453: getting the next task for host managed-node2 16380 1727204164.88462: done getting next task for host managed-node2 16380 1727204164.88465: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 16380 1727204164.88469: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204164.88474: getting variables 16380 1727204164.88476: in VariableManager get_vars() 16380 1727204164.88515: Calling all_inventory to load vars for managed-node2 16380 1727204164.88519: Calling groups_inventory to load vars for managed-node2 16380 1727204164.88524: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204164.88540: Calling all_plugins_play to load vars for managed-node2 16380 1727204164.88543: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204164.88547: Calling groups_plugins_play to load vars for managed-node2 16380 1727204164.94129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204164.98660: done with get_vars() 16380 1727204164.98713: done getting variables 16380 1727204164.98805: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204164.98957: variable 'profile' from source: play vars 16380 1727204164.98962: variable 'interface' from source: set_fact 16380 1727204164.99038: variable 'interface' from source: set_fact TASK [Assert that the profile is present - 'LSR-TST-br31'] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:56:04 -0400 (0:00:00.133) 0:00:26.097 ***** 16380 1727204164.99081: entering _queue_task() for managed-node2/assert 16380 1727204164.99471: worker is 1 (out of 1 available) 16380 1727204164.99485: exiting _queue_task() for managed-node2/assert 16380 1727204164.99608: done queuing things up, now waiting for results queue to drain 16380 1727204164.99611: waiting for pending results... 
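The play now turns from gathering profile state to asserting it. The next three tasks come from assert_profile_present.yml (lines 5, 10, and 15 per the task paths logged here and below), and each trace evaluates exactly one conditional, so the file plausibly reduces to three one-line asserts over the facts set earlier; any fail_msg text the file might define is not visible in this log:

    - name: Assert that the profile is present - '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      ansible.builtin.assert:
        that:
          - lsr_net_profile_fingerprint

All three pass below ("All assertions passed"), since the single set_fact task earlier set all three flags to true.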
16380 1727204164.99815: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'LSR-TST-br31' 16380 1727204164.99966: in run() - task 12b410aa-8751-749c-b6eb-000000000260 16380 1727204164.99992: variable 'ansible_search_path' from source: unknown 16380 1727204165.00002: variable 'ansible_search_path' from source: unknown 16380 1727204165.00054: calling self._execute() 16380 1727204165.00177: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.00192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.00211: variable 'omit' from source: magic vars 16380 1727204165.00665: variable 'ansible_distribution_major_version' from source: facts 16380 1727204165.00684: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204165.00808: variable 'omit' from source: magic vars 16380 1727204165.00814: variable 'omit' from source: magic vars 16380 1727204165.00899: variable 'profile' from source: play vars 16380 1727204165.00911: variable 'interface' from source: set_fact 16380 1727204165.01028: variable 'interface' from source: set_fact 16380 1727204165.01034: variable 'omit' from source: magic vars 16380 1727204165.01085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204165.01139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204165.01175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204165.01246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204165.01254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204165.01280: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204165.01291: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.01301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.01438: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204165.01463: Set connection var ansible_shell_executable to /bin/sh 16380 1727204165.01477: Set connection var ansible_connection to ssh 16380 1727204165.01491: Set connection var ansible_shell_type to sh 16380 1727204165.01572: Set connection var ansible_pipelining to False 16380 1727204165.01576: Set connection var ansible_timeout to 10 16380 1727204165.01579: variable 'ansible_shell_executable' from source: unknown 16380 1727204165.01583: variable 'ansible_connection' from source: unknown 16380 1727204165.01587: variable 'ansible_module_compression' from source: unknown 16380 1727204165.01589: variable 'ansible_shell_type' from source: unknown 16380 1727204165.01596: variable 'ansible_shell_executable' from source: unknown 16380 1727204165.01598: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.01600: variable 'ansible_pipelining' from source: unknown 16380 1727204165.01604: variable 'ansible_timeout' from source: unknown 16380 1727204165.01615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.01792: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204165.01820: variable 'omit' from source: magic vars 16380 1727204165.01832: starting attempt loop 16380 1727204165.01839: running the handler 16380 1727204165.01978: variable 'lsr_net_profile_exists' from source: set_fact 16380 1727204165.02007: Evaluated conditional (lsr_net_profile_exists): True 16380 1727204165.02010: handler run complete 16380 1727204165.02096: attempt loop complete, returning result 16380 1727204165.02100: _execute() done 16380 1727204165.02102: dumping result to json 16380 1727204165.02105: done dumping result, returning 16380 1727204165.02108: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'LSR-TST-br31' [12b410aa-8751-749c-b6eb-000000000260] 16380 1727204165.02112: sending task result for task 12b410aa-8751-749c-b6eb-000000000260 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16380 1727204165.02308: no more pending results, returning what we have 16380 1727204165.02312: results queue empty 16380 1727204165.02313: checking for any_errors_fatal 16380 1727204165.02322: done checking for any_errors_fatal 16380 1727204165.02323: checking for max_fail_percentage 16380 1727204165.02325: done checking for max_fail_percentage 16380 1727204165.02325: checking to see if all hosts have failed and the running result is not ok 16380 1727204165.02326: done checking to see if all hosts have failed 16380 1727204165.02328: getting the remaining hosts for this loop 16380 1727204165.02330: done getting the remaining hosts for this loop 16380 1727204165.02335: getting the next task for host managed-node2 16380 1727204165.02345: done getting next task for host managed-node2 16380 1727204165.02348: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 16380 1727204165.02353: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204165.02358: getting variables 16380 1727204165.02360: in VariableManager get_vars() 16380 1727204165.02395: Calling all_inventory to load vars for managed-node2 16380 1727204165.02399: Calling groups_inventory to load vars for managed-node2 16380 1727204165.02404: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204165.02418: Calling all_plugins_play to load vars for managed-node2 16380 1727204165.02423: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204165.02427: Calling groups_plugins_play to load vars for managed-node2 16380 1727204165.03006: done sending task result for task 12b410aa-8751-749c-b6eb-000000000260 16380 1727204165.03010: WORKER PROCESS EXITING 16380 1727204165.05107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204165.14019: done with get_vars() 16380 1727204165.14066: done getting variables 16380 1727204165.14143: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204165.14281: variable 'profile' from source: play vars 16380 1727204165.14284: variable 'interface' from source: set_fact 16380 1727204165.14387: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:56:05 -0400 (0:00:00.153) 0:00:26.250 ***** 16380 1727204165.14427: entering _queue_task() for managed-node2/assert 16380 1727204165.15033: worker is 1 (out of 1 available) 16380 1727204165.15046: exiting _queue_task() for managed-node2/assert 16380 1727204165.15058: done queuing things up, now waiting for results queue to drain 16380 1727204165.15060: waiting for pending results... 
16380 1727204165.15248: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 16380 1727204165.15388: in run() - task 12b410aa-8751-749c-b6eb-000000000261 16380 1727204165.15429: variable 'ansible_search_path' from source: unknown 16380 1727204165.15438: variable 'ansible_search_path' from source: unknown 16380 1727204165.15482: calling self._execute() 16380 1727204165.15600: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.15624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.15644: variable 'omit' from source: magic vars 16380 1727204165.16122: variable 'ansible_distribution_major_version' from source: facts 16380 1727204165.16141: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204165.16159: variable 'omit' from source: magic vars 16380 1727204165.16220: variable 'omit' from source: magic vars 16380 1727204165.16382: variable 'profile' from source: play vars 16380 1727204165.16387: variable 'interface' from source: set_fact 16380 1727204165.16460: variable 'interface' from source: set_fact 16380 1727204165.16501: variable 'omit' from source: magic vars 16380 1727204165.16598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204165.16602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204165.16631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204165.16658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204165.16677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204165.16729: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204165.16739: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.16816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.16956: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204165.16970: Set connection var ansible_shell_executable to /bin/sh 16380 1727204165.16990: Set connection var ansible_connection to ssh 16380 1727204165.17006: Set connection var ansible_shell_type to sh 16380 1727204165.17014: Set connection var ansible_pipelining to False 16380 1727204165.17022: Set connection var ansible_timeout to 10 16380 1727204165.17046: variable 'ansible_shell_executable' from source: unknown 16380 1727204165.17052: variable 'ansible_connection' from source: unknown 16380 1727204165.17057: variable 'ansible_module_compression' from source: unknown 16380 1727204165.17060: variable 'ansible_shell_type' from source: unknown 16380 1727204165.17064: variable 'ansible_shell_executable' from source: unknown 16380 1727204165.17068: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.17073: variable 'ansible_pipelining' from source: unknown 16380 1727204165.17077: variable 'ansible_timeout' from source: unknown 16380 1727204165.17082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.17225: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204165.17236: variable 'omit' from source: magic vars 16380 1727204165.17242: starting attempt loop 16380 1727204165.17247: running the handler 16380 1727204165.17339: variable 'lsr_net_profile_ansible_managed' from source: set_fact 16380 1727204165.17342: Evaluated conditional (lsr_net_profile_ansible_managed): True 16380 1727204165.17350: handler run complete 16380 1727204165.17368: attempt loop complete, returning result 16380 1727204165.17371: _execute() done 16380 1727204165.17373: dumping result to json 16380 1727204165.17379: done dumping result, returning 16380 1727204165.17388: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [12b410aa-8751-749c-b6eb-000000000261] 16380 1727204165.17395: sending task result for task 12b410aa-8751-749c-b6eb-000000000261 16380 1727204165.17481: done sending task result for task 12b410aa-8751-749c-b6eb-000000000261 16380 1727204165.17484: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16380 1727204165.17542: no more pending results, returning what we have 16380 1727204165.17546: results queue empty 16380 1727204165.17547: checking for any_errors_fatal 16380 1727204165.17555: done checking for any_errors_fatal 16380 1727204165.17556: checking for max_fail_percentage 16380 1727204165.17557: done checking for max_fail_percentage 16380 1727204165.17558: checking to see if all hosts have failed and the running result is not ok 16380 1727204165.17559: done checking to see if all hosts have failed 16380 1727204165.17560: getting the remaining hosts for this loop 16380 1727204165.17562: done getting the remaining hosts for this loop 16380 1727204165.17567: getting the next task for host managed-node2 16380 1727204165.17574: done getting next task for host managed-node2 16380 1727204165.17577: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 16380 1727204165.17580: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204165.17584: getting variables 16380 1727204165.17586: in VariableManager get_vars() 16380 1727204165.17620: Calling all_inventory to load vars for managed-node2 16380 1727204165.17623: Calling groups_inventory to load vars for managed-node2 16380 1727204165.17628: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204165.17639: Calling all_plugins_play to load vars for managed-node2 16380 1727204165.17643: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204165.17646: Calling groups_plugins_play to load vars for managed-node2 16380 1727204165.19586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204165.22390: done with get_vars() 16380 1727204165.22423: done getting variables 16380 1727204165.22473: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204165.22573: variable 'profile' from source: play vars 16380 1727204165.22577: variable 'interface' from source: set_fact 16380 1727204165.22629: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:56:05 -0400 (0:00:00.082) 0:00:26.333 ***** 16380 1727204165.22659: entering _queue_task() for managed-node2/assert 16380 1727204165.22976: worker is 1 (out of 1 available) 16380 1727204165.22993: exiting _queue_task() for managed-node2/assert 16380 1727204165.23007: done queuing things up, now waiting for results queue to drain 16380 1727204165.23011: waiting for pending results... 
16380 1727204165.23279: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 16380 1727204165.23375: in run() - task 12b410aa-8751-749c-b6eb-000000000262 16380 1727204165.23398: variable 'ansible_search_path' from source: unknown 16380 1727204165.23403: variable 'ansible_search_path' from source: unknown 16380 1727204165.23432: calling self._execute() 16380 1727204165.23515: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.23519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.23529: variable 'omit' from source: magic vars 16380 1727204165.23876: variable 'ansible_distribution_major_version' from source: facts 16380 1727204165.23886: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204165.23895: variable 'omit' from source: magic vars 16380 1727204165.23934: variable 'omit' from source: magic vars 16380 1727204165.24104: variable 'profile' from source: play vars 16380 1727204165.24108: variable 'interface' from source: set_fact 16380 1727204165.24114: variable 'interface' from source: set_fact 16380 1727204165.24150: variable 'omit' from source: magic vars 16380 1727204165.24207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204165.24348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204165.24351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204165.24354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204165.24356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204165.24359: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204165.24362: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.24364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.24597: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204165.24600: Set connection var ansible_shell_executable to /bin/sh 16380 1727204165.24603: Set connection var ansible_connection to ssh 16380 1727204165.24606: Set connection var ansible_shell_type to sh 16380 1727204165.24608: Set connection var ansible_pipelining to False 16380 1727204165.24613: Set connection var ansible_timeout to 10 16380 1727204165.24616: variable 'ansible_shell_executable' from source: unknown 16380 1727204165.24618: variable 'ansible_connection' from source: unknown 16380 1727204165.24620: variable 'ansible_module_compression' from source: unknown 16380 1727204165.24622: variable 'ansible_shell_type' from source: unknown 16380 1727204165.24624: variable 'ansible_shell_executable' from source: unknown 16380 1727204165.24626: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.24628: variable 'ansible_pipelining' from source: unknown 16380 1727204165.24631: variable 'ansible_timeout' from source: unknown 16380 1727204165.24633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.24726: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204165.24731: variable 'omit' from source: magic vars 16380 1727204165.24797: starting attempt loop 16380 1727204165.24805: running the handler 16380 1727204165.24869: variable 'lsr_net_profile_fingerprint' from source: set_fact 16380 1727204165.24875: Evaluated conditional (lsr_net_profile_fingerprint): True 16380 1727204165.24916: handler run complete 16380 1727204165.24920: attempt loop complete, returning result 16380 1727204165.24922: _execute() done 16380 1727204165.24974: dumping result to json 16380 1727204165.24978: done dumping result, returning 16380 1727204165.24984: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000262] 16380 1727204165.24987: sending task result for task 12b410aa-8751-749c-b6eb-000000000262 16380 1727204165.25054: done sending task result for task 12b410aa-8751-749c-b6eb-000000000262 16380 1727204165.25057: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16380 1727204165.25124: no more pending results, returning what we have 16380 1727204165.25127: results queue empty 16380 1727204165.25128: checking for any_errors_fatal 16380 1727204165.25137: done checking for any_errors_fatal 16380 1727204165.25138: checking for max_fail_percentage 16380 1727204165.25139: done checking for max_fail_percentage 16380 1727204165.25141: checking to see if all hosts have failed and the running result is not ok 16380 1727204165.25142: done checking to see if all hosts have failed 16380 1727204165.25143: getting the remaining hosts for this loop 16380 1727204165.25145: done getting the remaining hosts for this loop 16380 1727204165.25149: getting the next task for host managed-node2 16380 1727204165.25158: done getting next task for host managed-node2 16380 1727204165.25160: ^ task is: TASK: meta (flush_handlers) 16380 1727204165.25162: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204165.25167: getting variables 16380 1727204165.25169: in VariableManager get_vars() 16380 1727204165.25332: Calling all_inventory to load vars for managed-node2 16380 1727204165.25339: Calling groups_inventory to load vars for managed-node2 16380 1727204165.25347: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204165.25358: Calling all_plugins_play to load vars for managed-node2 16380 1727204165.25362: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204165.25367: Calling groups_plugins_play to load vars for managed-node2 16380 1727204165.27278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204165.29746: done with get_vars() 16380 1727204165.29784: done getting variables 16380 1727204165.29882: in VariableManager get_vars() 16380 1727204165.29894: Calling all_inventory to load vars for managed-node2 16380 1727204165.29896: Calling groups_inventory to load vars for managed-node2 16380 1727204165.29898: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204165.29903: Calling all_plugins_play to load vars for managed-node2 16380 1727204165.29905: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204165.29907: Calling groups_plugins_play to load vars for managed-node2 16380 1727204165.31752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204165.33911: done with get_vars() 16380 1727204165.33937: done queuing things up, now waiting for results queue to drain 16380 1727204165.33938: results queue empty 16380 1727204165.33939: checking for any_errors_fatal 16380 1727204165.33942: done checking for any_errors_fatal 16380 1727204165.33943: checking for max_fail_percentage 16380 1727204165.33944: done checking for max_fail_percentage 16380 1727204165.33948: checking to see if all hosts have failed and the running result is not ok 16380 1727204165.33949: done checking to see if all hosts have failed 16380 1727204165.33949: getting the remaining hosts for this loop 16380 1727204165.33950: done getting the remaining hosts for this loop 16380 1727204165.33952: getting the next task for host managed-node2 16380 1727204165.33956: done getting next task for host managed-node2 16380 1727204165.33957: ^ task is: TASK: meta (flush_handlers) 16380 1727204165.33958: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204165.33960: getting variables 16380 1727204165.33961: in VariableManager get_vars() 16380 1727204165.33968: Calling all_inventory to load vars for managed-node2 16380 1727204165.33969: Calling groups_inventory to load vars for managed-node2 16380 1727204165.33972: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204165.33978: Calling all_plugins_play to load vars for managed-node2 16380 1727204165.33980: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204165.33982: Calling groups_plugins_play to load vars for managed-node2 16380 1727204165.35096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204165.37394: done with get_vars() 16380 1727204165.37440: done getting variables 16380 1727204165.37540: in VariableManager get_vars() 16380 1727204165.37556: Calling all_inventory to load vars for managed-node2 16380 1727204165.37560: Calling groups_inventory to load vars for managed-node2 16380 1727204165.37564: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204165.37573: Calling all_plugins_play to load vars for managed-node2 16380 1727204165.37577: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204165.37581: Calling groups_plugins_play to load vars for managed-node2 16380 1727204165.39348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204165.41111: done with get_vars() 16380 1727204165.41163: done queuing things up, now waiting for results queue to drain 16380 1727204165.41171: results queue empty 16380 1727204165.41172: checking for any_errors_fatal 16380 1727204165.41174: done checking for any_errors_fatal 16380 1727204165.41175: checking for max_fail_percentage 16380 1727204165.41176: done checking for max_fail_percentage 16380 1727204165.41177: checking to see if all hosts have failed and the running result is not ok 16380 1727204165.41178: done checking to see if all hosts have failed 16380 1727204165.41179: getting the remaining hosts for this loop 16380 1727204165.41180: done getting the remaining hosts for this loop 16380 1727204165.41187: getting the next task for host managed-node2 16380 1727204165.41195: done getting next task for host managed-node2 16380 1727204165.41196: ^ task is: None 16380 1727204165.41198: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204165.41199: done queuing things up, now waiting for results queue to drain 16380 1727204165.41201: results queue empty 16380 1727204165.41202: checking for any_errors_fatal 16380 1727204165.41203: done checking for any_errors_fatal 16380 1727204165.41204: checking for max_fail_percentage 16380 1727204165.41205: done checking for max_fail_percentage 16380 1727204165.41206: checking to see if all hosts have failed and the running result is not ok 16380 1727204165.41207: done checking to see if all hosts have failed 16380 1727204165.41211: getting the next task for host managed-node2 16380 1727204165.41215: done getting next task for host managed-node2 16380 1727204165.41216: ^ task is: None 16380 1727204165.41218: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204165.41267: in VariableManager get_vars() 16380 1727204165.41291: done with get_vars() 16380 1727204165.41297: in VariableManager get_vars() 16380 1727204165.41308: done with get_vars() 16380 1727204165.41314: variable 'omit' from source: magic vars 16380 1727204165.41420: variable 'profile' from source: play vars 16380 1727204165.41569: in VariableManager get_vars() 16380 1727204165.41600: done with get_vars() 16380 1727204165.41649: variable 'omit' from source: magic vars 16380 1727204165.41756: variable 'profile' from source: play vars

PLAY [Set down {{ profile }}] **************************************************

16380 1727204165.42727: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204165.42758: getting the remaining hosts for this loop 16380 1727204165.42759: done getting the remaining hosts for this loop 16380 1727204165.42761: getting the next task for host managed-node2 16380 1727204165.42764: done getting next task for host managed-node2 16380 1727204165.42765: ^ task is: TASK: Gathering Facts 16380 1727204165.42766: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
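The PLAY banner above still shows the raw {{ profile }} placeholder because the banner is printed from the unrendered play name, before per-host templating happens; note the "variable 'profile' from source: play vars" records immediately preceding it. Once a variable context exists, the same template renders normally. A minimal sketch, using the LSR-TST-br31 profile name seen elsewhere in this run (the rendering call is illustrative, not Ansible internals):

from jinja2 import Template

# The banner prints the unrendered play name; with `profile` bound
# (value taken from the profile under test in this log), it renders to:
print(Template("Set down {{ profile }}").render(profile="LSR-TST-br31"))
# -> Set down LSR-TST-br31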
16380 1727204165.42768: getting variables 16380 1727204165.42768: in VariableManager get_vars() 16380 1727204165.42850: Calling all_inventory to load vars for managed-node2 16380 1727204165.42852: Calling groups_inventory to load vars for managed-node2 16380 1727204165.42854: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204165.42859: Calling all_plugins_play to load vars for managed-node2 16380 1727204165.42861: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204165.42864: Calling groups_plugins_play to load vars for managed-node2 16380 1727204165.45561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204165.47987: done with get_vars() 16380 1727204165.48053: done getting variables 16380 1727204165.48195: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Tuesday 24 September 2024 14:56:05 -0400 (0:00:00.255) 0:00:26.588 *****

16380 1727204165.48231: entering _queue_task() for managed-node2/gather_facts 16380 1727204165.48595: worker is 1 (out of 1 available) 16380 1727204165.48608: exiting _queue_task() for managed-node2/gather_facts 16380 1727204165.48624: done queuing things up, now waiting for results queue to drain 16380 1727204165.48626: waiting for pending results...
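The records that follow trace the low-level command sequence for this task: probe the remote home directory with /bin/sh -c 'echo ~ && sleep 0', create a temporary directory under ~/.ansible/tmp, sftp the AnsiballZ_setup.py payload across, then execute it with the remote Python. A rough stand-in for the first two steps, using plain subprocess over ssh rather than Ansible's connection plugin machinery (the host address is taken from the ssh debug output below; the directory name is a placeholder for the timestamped name Ansible generates):

import subprocess

HOST = "root@10.31.9.159"  # address seen in the ssh debug output in this log

def run_remote(cmd: str) -> str:
    """Run a command on the remote host as /bin/sh -c '<cmd> && sleep 0'."""
    proc = subprocess.run(
        ["ssh", HOST, f"/bin/sh -c '{cmd} && sleep 0'"],
        capture_output=True, text=True, check=True,
    )
    return proc.stdout.strip()

# First probe in the log: discover the remote home directory (-> /root).
home = run_remote("echo ~")

# Second step: create a private work directory for the module payload.
# "example-tmp" is a placeholder; Ansible uses a timestamped random name.
tmpdir = run_remote(
    'umask 77 && mkdir -p "$HOME/.ansible/tmp/example-tmp"'
    ' && echo "$HOME/.ansible/tmp/example-tmp"'
)
print(home, tmpdir)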
16380 1727204165.49007: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204165.49035: in run() - task 12b410aa-8751-749c-b6eb-0000000002b5 16380 1727204165.49057: variable 'ansible_search_path' from source: unknown 16380 1727204165.49107: calling self._execute() 16380 1727204165.49234: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.49248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.49264: variable 'omit' from source: magic vars 16380 1727204165.49719: variable 'ansible_distribution_major_version' from source: facts 16380 1727204165.49739: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204165.49752: variable 'omit' from source: magic vars 16380 1727204165.50088: variable 'omit' from source: magic vars 16380 1727204165.50091: variable 'omit' from source: magic vars 16380 1727204165.50096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204165.50148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204165.50239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204165.50265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204165.50287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204165.50367: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204165.50438: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.50447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.50759: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204165.50773: Set connection var ansible_shell_executable to /bin/sh 16380 1727204165.50785: Set connection var ansible_connection to ssh 16380 1727204165.50799: Set connection var ansible_shell_type to sh 16380 1727204165.50994: Set connection var ansible_pipelining to False 16380 1727204165.50998: Set connection var ansible_timeout to 10 16380 1727204165.51000: variable 'ansible_shell_executable' from source: unknown 16380 1727204165.51003: variable 'ansible_connection' from source: unknown 16380 1727204165.51006: variable 'ansible_module_compression' from source: unknown 16380 1727204165.51008: variable 'ansible_shell_type' from source: unknown 16380 1727204165.51013: variable 'ansible_shell_executable' from source: unknown 16380 1727204165.51015: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204165.51017: variable 'ansible_pipelining' from source: unknown 16380 1727204165.51020: variable 'ansible_timeout' from source: unknown 16380 1727204165.51022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204165.51370: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204165.51388: variable 'omit' from source: magic vars 16380 1727204165.51402: starting attempt loop 16380 1727204165.51413: running the 
handler 16380 1727204165.51434: variable 'ansible_facts' from source: unknown 16380 1727204165.51470: _low_level_execute_command(): starting 16380 1727204165.51484: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204165.52312: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204165.52393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204165.52414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204165.52491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204165.54316: stdout chunk (state=3): >>>/root <<< 16380 1727204165.54514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204165.54520: stdout chunk (state=3): >>><<< 16380 1727204165.54526: stderr chunk (state=3): >>><<< 16380 1727204165.54746: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204165.54750: _low_level_execute_command(): starting 16380 1727204165.54753: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189 `" && echo ansible-tmp-1727204165.5465534-18297-60669228686189="` echo /root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189 `" ) && sleep 0' 16380 1727204165.55820: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 
24 Oct 2023 <<< 16380 1727204165.56107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204165.56312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204165.56506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204165.58605: stdout chunk (state=3): >>>ansible-tmp-1727204165.5465534-18297-60669228686189=/root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189 <<< 16380 1727204165.58817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204165.58828: stdout chunk (state=3): >>><<< 16380 1727204165.58840: stderr chunk (state=3): >>><<< 16380 1727204165.58867: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204165.5465534-18297-60669228686189=/root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204165.59295: variable 'ansible_module_compression' from source: unknown 16380 1727204165.59298: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204165.59300: variable 'ansible_facts' from source: unknown 16380 1727204165.59448: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/AnsiballZ_setup.py 16380 1727204165.60117: Sending initial data 16380 1727204165.60129: Sent initial data (153 bytes) 16380 1727204165.60923: stderr chunk (state=3): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204165.60937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204165.60951: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204165.61226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204165.61229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204165.61318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204165.61379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204165.63154: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204165.63188: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204165.63232: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp7m1k_dq3 /root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/AnsiballZ_setup.py <<< 16380 1727204165.63243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/AnsiballZ_setup.py" <<< 16380 1727204165.63317: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp7m1k_dq3" to remote "/root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/AnsiballZ_setup.py" <<< 16380 1727204165.66426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204165.66527: stderr chunk (state=3): >>><<< 16380 1727204165.66539: stdout chunk (state=3): >>><<< 16380 1727204165.66586: done transferring module to remote 16380 1727204165.66678: _low_level_execute_command(): starting 16380 1727204165.66693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/ /root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/AnsiballZ_setup.py && sleep 0' 16380 1727204165.67932: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204165.67950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204165.67999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204165.68437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204165.68707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204165.68772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204165.70757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204165.70857: stderr chunk (state=3): >>><<< 16380 1727204165.70867: stdout chunk (state=3): >>><<< 16380 1727204165.70956: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204165.70965: _low_level_execute_command(): starting 16380 1727204165.70968: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/AnsiballZ_setup.py && sleep 0' 16380 1727204165.72306: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204165.72407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204165.72426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204165.72505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204166.41138: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_hostnqn": "", "ansible_python": <<< 16380 1727204166.41161: stdout chunk (state=3): >>>{"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.43115234375, "5m": 0.52099609375, "15m": 0.33837890625}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "06", "epoch": "1727204166", "epoch_int": "1727204166", "date": "2024-09-24", "time": "14:56:06", "iso8601_micro": "2024-09-24T18:56:06.045246Z", "iso8601": "2024-09-24T18:56:06Z", "iso8601_basic": "20240924T145606045246", "iso8601_basic_short": "20240924T145606", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2838, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 879, "free": 2838}, "nocache": {"free": 3466, "used": 251}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, 
"vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 670, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251155947520, "block_size": 4096, "block_total": 64479564, "block_available": 61317370, "block_used": 3162194, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "LSR-TST-br31", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:07:3c:a4:a3:3f", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204166.43367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204166.43450: stderr chunk (state=3): >>><<< 16380 1727204166.43454: stdout chunk (state=3): >>><<< 16380 1727204166.43481: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": 
"/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.43115234375, "5m": 0.52099609375, "15m": 0.33837890625}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "06", "epoch": "1727204166", "epoch_int": "1727204166", "date": "2024-09-24", "time": "14:56:06", "iso8601_micro": "2024-09-24T18:56:06.045246Z", "iso8601": "2024-09-24T18:56:06Z", "iso8601_basic": "20240924T145606045246", "iso8601_basic_short": "20240924T145606", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2838, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 879, "free": 2838}, "nocache": {"free": 3466, "used": 251}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": 
{"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 670, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251155947520, "block_size": 4096, "block_total": 64479564, "block_available": 61317370, "block_used": 3162194, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "LSR-TST-br31", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:07:3c:a4:a3:3f", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204166.43778: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204166.43791: _low_level_execute_command(): starting 16380 1727204166.43797: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204165.5465534-18297-60669228686189/ > /dev/null 2>&1 && sleep 0' 16380 1727204166.44618: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204166.44758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204166.44812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204166.44872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 16380 1727204166.45022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204166.46957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204166.47005: stderr chunk (state=3): >>><<< 16380 1727204166.47008: stdout chunk (state=3): >>><<< 16380 1727204166.47030: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204166.47037: handler run complete 16380 1727204166.47219: variable 'ansible_facts' from source: unknown 16380 1727204166.47612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204166.48176: variable 'ansible_facts' from source: unknown 16380 1727204166.48394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204166.48595: attempt loop complete, returning result 16380 1727204166.48613: _execute() done 16380 1727204166.48629: dumping result to json 16380 1727204166.48873: done dumping result, returning 16380 1727204166.48877: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-0000000002b5] 16380 1727204166.48880: sending task result for task 12b410aa-8751-749c-b6eb-0000000002b5

ok: [managed-node2]

16380 1727204166.49837: no more pending results, returning what we have 16380 1727204166.49841: results queue empty 16380 1727204166.49842: checking for any_errors_fatal 16380 1727204166.49844: done checking for any_errors_fatal 16380 1727204166.49844: checking for max_fail_percentage 16380 1727204166.49846: done checking for max_fail_percentage 16380 1727204166.49847: checking to see if all hosts have failed and the running result is not ok 16380 1727204166.49848: done checking to see if all hosts have failed 16380 1727204166.49851: getting the remaining hosts for this loop 16380 1727204166.49853: done getting the remaining hosts for this loop 16380 1727204166.49858: getting the next task for host managed-node2 16380 1727204166.49904: done getting next task for host managed-node2 16380 1727204166.49907: ^ task is: TASK: meta (flush_handlers) 16380 1727204166.49912: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False 16380 1727204166.49916: getting variables 16380 1727204166.49918: in VariableManager get_vars() 16380 1727204166.49953: Calling all_inventory to load vars for managed-node2 16380 1727204166.49957: Calling groups_inventory to load vars for managed-node2 16380 1727204166.49960: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204166.49967: done sending task result for task 12b410aa-8751-749c-b6eb-0000000002b5 16380 1727204166.49970: WORKER PROCESS EXITING 16380 1727204166.49981: Calling all_plugins_play to load vars for managed-node2 16380 1727204166.49985: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204166.49998: Calling groups_plugins_play to load vars for managed-node2 16380 1727204166.51869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204166.55542: done with get_vars() 16380 1727204166.55577: done getting variables 16380 1727204166.55663: in VariableManager get_vars() 16380 1727204166.55680: Calling all_inventory to load vars for managed-node2 16380 1727204166.55683: Calling groups_inventory to load vars for managed-node2 16380 1727204166.55686: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204166.55695: Calling all_plugins_play to load vars for managed-node2 16380 1727204166.55698: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204166.55702: Calling groups_plugins_play to load vars for managed-node2 16380 1727204166.58128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204166.62157: done with get_vars() 16380 1727204166.62205: done queuing things up, now waiting for results queue to drain 16380 1727204166.62208: results queue empty 16380 1727204166.62211: checking for any_errors_fatal 16380 1727204166.62217: done checking for any_errors_fatal 16380 1727204166.62218: checking for max_fail_percentage 16380 1727204166.62219: done checking for max_fail_percentage 16380 1727204166.62220: checking to see if all hosts have failed and the running result is not ok 16380 1727204166.62221: done checking to see if all hosts have failed 16380 1727204166.62222: getting the remaining hosts for this loop 16380 1727204166.62228: done getting the remaining hosts for this loop 16380 1727204166.62231: getting the next task for host managed-node2 16380 1727204166.62236: done getting next task for host managed-node2 16380 1727204166.62244: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16380 1727204166.62246: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204166.62258: getting variables 16380 1727204166.62260: in VariableManager get_vars() 16380 1727204166.62283: Calling all_inventory to load vars for managed-node2 16380 1727204166.62318: Calling groups_inventory to load vars for managed-node2 16380 1727204166.62343: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204166.62403: Calling all_plugins_play to load vars for managed-node2 16380 1727204166.62408: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204166.62455: Calling groups_plugins_play to load vars for managed-node2 16380 1727204166.65958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204166.69309: done with get_vars() 16380 1727204166.69360: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:06 -0400 (0:00:01.214) 0:00:27.803 ***** 16380 1727204166.69702: entering _queue_task() for managed-node2/include_tasks 16380 1727204166.70452: worker is 1 (out of 1 available) 16380 1727204166.70466: exiting _queue_task() for managed-node2/include_tasks 16380 1727204166.70481: done queuing things up, now waiting for results queue to drain 16380 1727204166.70484: waiting for pending results... 16380 1727204166.70740: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16380 1727204166.71003: in run() - task 12b410aa-8751-749c-b6eb-00000000003a 16380 1727204166.71008: variable 'ansible_search_path' from source: unknown 16380 1727204166.71011: variable 'ansible_search_path' from source: unknown 16380 1727204166.71014: calling self._execute() 16380 1727204166.71293: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204166.71308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204166.71325: variable 'omit' from source: magic vars 16380 1727204166.71894: variable 'ansible_distribution_major_version' from source: facts 16380 1727204166.71919: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204166.71932: _execute() done 16380 1727204166.71942: dumping result to json 16380 1727204166.71951: done dumping result, returning 16380 1727204166.71962: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-749c-b6eb-00000000003a] 16380 1727204166.71972: sending task result for task 12b410aa-8751-749c-b6eb-00000000003a 16380 1727204166.72165: no more pending results, returning what we have 16380 1727204166.72170: in VariableManager get_vars() 16380 1727204166.72224: Calling all_inventory to load vars for managed-node2 16380 1727204166.72228: Calling groups_inventory to load vars for managed-node2 16380 1727204166.72231: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204166.72247: Calling all_plugins_play to load vars for managed-node2 16380 1727204166.72250: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204166.72254: Calling groups_plugins_play to load vars for managed-node2 16380 1727204166.73006: done sending task result for task 12b410aa-8751-749c-b6eb-00000000003a 16380 1727204166.73010: WORKER PROCESS EXITING 16380 1727204166.75013: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204166.79084: done with get_vars() 16380 1727204166.79116: variable 'ansible_search_path' from source: unknown 16380 1727204166.79118: variable 'ansible_search_path' from source: unknown 16380 1727204166.79155: we have included files to process 16380 1727204166.79157: generating all_blocks data 16380 1727204166.79158: done generating all_blocks data 16380 1727204166.79159: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16380 1727204166.79161: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16380 1727204166.79164: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16380 1727204166.80115: done processing included file 16380 1727204166.80117: iterating over new_blocks loaded from include file 16380 1727204166.80119: in VariableManager get_vars() 16380 1727204166.80146: done with get_vars() 16380 1727204166.80148: filtering new block on tags 16380 1727204166.80170: done filtering new block on tags 16380 1727204166.80174: in VariableManager get_vars() 16380 1727204166.80201: done with get_vars() 16380 1727204166.80203: filtering new block on tags 16380 1727204166.80227: done filtering new block on tags 16380 1727204166.80230: in VariableManager get_vars() 16380 1727204166.80254: done with get_vars() 16380 1727204166.80256: filtering new block on tags 16380 1727204166.80278: done filtering new block on tags 16380 1727204166.80281: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 16380 1727204166.80287: extending task lists for all hosts with included blocks 16380 1727204166.80777: done extending task lists 16380 1727204166.80779: done processing included files 16380 1727204166.80780: results queue empty 16380 1727204166.80781: checking for any_errors_fatal 16380 1727204166.80783: done checking for any_errors_fatal 16380 1727204166.80784: checking for max_fail_percentage 16380 1727204166.80786: done checking for max_fail_percentage 16380 1727204166.80787: checking to see if all hosts have failed and the running result is not ok 16380 1727204166.80788: done checking to see if all hosts have failed 16380 1727204166.80790: getting the remaining hosts for this loop 16380 1727204166.80792: done getting the remaining hosts for this loop 16380 1727204166.80795: getting the next task for host managed-node2 16380 1727204166.80800: done getting next task for host managed-node2 16380 1727204166.80804: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16380 1727204166.80807: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204166.80819: getting variables 16380 1727204166.80820: in VariableManager get_vars() 16380 1727204166.80838: Calling all_inventory to load vars for managed-node2 16380 1727204166.80841: Calling groups_inventory to load vars for managed-node2 16380 1727204166.80844: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204166.80851: Calling all_plugins_play to load vars for managed-node2 16380 1727204166.80854: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204166.80858: Calling groups_plugins_play to load vars for managed-node2 16380 1727204166.82869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204166.85735: done with get_vars() 16380 1727204166.85775: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:06 -0400 (0:00:00.161) 0:00:27.965 ***** 16380 1727204166.85874: entering _queue_task() for managed-node2/setup 16380 1727204166.86260: worker is 1 (out of 1 available) 16380 1727204166.86275: exiting _queue_task() for managed-node2/setup 16380 1727204166.86290: done queuing things up, now waiting for results queue to drain 16380 1727204166.86494: waiting for pending results... 16380 1727204166.86595: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16380 1727204166.86758: in run() - task 12b410aa-8751-749c-b6eb-0000000002f6 16380 1727204166.86782: variable 'ansible_search_path' from source: unknown 16380 1727204166.86793: variable 'ansible_search_path' from source: unknown 16380 1727204166.86842: calling self._execute() 16380 1727204166.86953: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204166.86967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204166.86984: variable 'omit' from source: magic vars 16380 1727204166.87429: variable 'ansible_distribution_major_version' from source: facts 16380 1727204166.87449: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204166.87739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204166.89908: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204166.89965: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204166.90004: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204166.90037: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204166.90060: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204166.90136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204166.90159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 16380 1727204166.90180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204166.90223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204166.90236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204166.90281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204166.90304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204166.90330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204166.90361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204166.90373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204166.90565: variable '__network_required_facts' from source: role '' defaults 16380 1727204166.90569: variable 'ansible_facts' from source: unknown 16380 1727204166.91846: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 16380 1727204166.91850: when evaluation is False, skipping this task 16380 1727204166.91853: _execute() done 16380 1727204166.91856: dumping result to json 16380 1727204166.91859: done dumping result, returning 16380 1727204166.91868: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-749c-b6eb-0000000002f6] 16380 1727204166.91875: sending task result for task 12b410aa-8751-749c-b6eb-0000000002f6 16380 1727204166.91978: done sending task result for task 12b410aa-8751-749c-b6eb-0000000002f6 16380 1727204166.91982: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204166.92052: no more pending results, returning what we have 16380 1727204166.92056: results queue empty 16380 1727204166.92057: checking for any_errors_fatal 16380 1727204166.92059: done checking for any_errors_fatal 16380 1727204166.92059: checking for max_fail_percentage 16380 1727204166.92061: done checking for max_fail_percentage 16380 1727204166.92062: checking to see if all hosts have failed and the running result is not ok 16380 1727204166.92063: done checking to see if all hosts have failed 16380 1727204166.92064: getting the remaining hosts for this loop 16380 1727204166.92066: done getting the remaining hosts for 
this loop 16380 1727204166.92070: getting the next task for host managed-node2 16380 1727204166.92080: done getting next task for host managed-node2 16380 1727204166.92084: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 16380 1727204166.92088: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204166.92107: getting variables 16380 1727204166.92111: in VariableManager get_vars() 16380 1727204166.92154: Calling all_inventory to load vars for managed-node2 16380 1727204166.92158: Calling groups_inventory to load vars for managed-node2 16380 1727204166.92160: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204166.92171: Calling all_plugins_play to load vars for managed-node2 16380 1727204166.92174: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204166.92177: Calling groups_plugins_play to load vars for managed-node2 16380 1727204166.93546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204166.96022: done with get_vars() 16380 1727204166.96055: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:06 -0400 (0:00:00.102) 0:00:28.068 ***** 16380 1727204166.96144: entering _queue_task() for managed-node2/stat 16380 1727204166.96421: worker is 1 (out of 1 available) 16380 1727204166.96434: exiting _queue_task() for managed-node2/stat 16380 1727204166.96449: done queuing things up, now waiting for results queue to drain 16380 1727204166.96452: waiting for pending results... 
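
The skip recorded just above comes from the role's fact gate: fedora.linux_system_roles.network only re-gathers facts when something it needs is absent from what setup already collected. A minimal Python sketch of the logged conditional, '__network_required_facts | difference(ansible_facts.keys() | list) | length > 0'; the required-fact names below are assumptions for illustration, since the role's real '__network_required_facts' default never appears in this log:

    # Sketch of the conditional evaluated above. The fact names are
    # illustrative assumptions, not the role's actual defaults.
    required_facts = ["distribution", "distribution_major_version", "os_family"]

    def needs_fact_gathering(ansible_facts):
        # Jinja2's `difference` filter: members of required_facts that are
        # not among the gathered fact keys. Non-empty -> run setup again.
        missing = set(required_facts).difference(ansible_facts.keys())
        return len(missing) > 0

    gathered = {"distribution": "Fedora",
                "distribution_major_version": "39",
                "os_family": "RedHat"}
    print(needs_fact_gathering(gathered))  # False -> task skipped, as logged

Because a full Gathering Facts task already ran for managed-node2 earlier in this play, every required key is present and the conditional collapses to False, so the task is skipped under its 'no_log: true' censoring.
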
16380 1727204166.96633: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 16380 1727204166.96745: in run() - task 12b410aa-8751-749c-b6eb-0000000002f8 16380 1727204166.96758: variable 'ansible_search_path' from source: unknown 16380 1727204166.96762: variable 'ansible_search_path' from source: unknown 16380 1727204166.96800: calling self._execute() 16380 1727204166.96876: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204166.96883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204166.96899: variable 'omit' from source: magic vars 16380 1727204166.97230: variable 'ansible_distribution_major_version' from source: facts 16380 1727204166.97237: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204166.97376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204166.97796: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204166.97799: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204166.97801: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204166.97843: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204166.97963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204166.98003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204166.98059: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204166.98099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204166.98230: variable '__network_is_ostree' from source: set_fact 16380 1727204166.98359: Evaluated conditional (not __network_is_ostree is defined): False 16380 1727204166.98363: when evaluation is False, skipping this task 16380 1727204166.98365: _execute() done 16380 1727204166.98368: dumping result to json 16380 1727204166.98370: done dumping result, returning 16380 1727204166.98373: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-749c-b6eb-0000000002f8] 16380 1727204166.98375: sending task result for task 12b410aa-8751-749c-b6eb-0000000002f8 16380 1727204166.98450: done sending task result for task 12b410aa-8751-749c-b6eb-0000000002f8 16380 1727204166.98456: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16380 1727204166.98527: no more pending results, returning what we have 16380 1727204166.98532: results queue empty 16380 1727204166.98533: checking for any_errors_fatal 16380 1727204166.98541: done checking for any_errors_fatal 16380 1727204166.98542: checking for 
max_fail_percentage 16380 1727204166.98544: done checking for max_fail_percentage 16380 1727204166.98546: checking to see if all hosts have failed and the running result is not ok 16380 1727204166.98547: done checking to see if all hosts have failed 16380 1727204166.98548: getting the remaining hosts for this loop 16380 1727204166.98550: done getting the remaining hosts for this loop 16380 1727204166.98555: getting the next task for host managed-node2 16380 1727204166.98571: done getting next task for host managed-node2 16380 1727204166.98575: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16380 1727204166.98578: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204166.98597: getting variables 16380 1727204166.98599: in VariableManager get_vars() 16380 1727204166.98647: Calling all_inventory to load vars for managed-node2 16380 1727204166.98651: Calling groups_inventory to load vars for managed-node2 16380 1727204166.98654: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204166.98670: Calling all_plugins_play to load vars for managed-node2 16380 1727204166.98676: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204166.98681: Calling groups_plugins_play to load vars for managed-node2 16380 1727204167.01108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204167.02811: done with get_vars() 16380 1727204167.02836: done getting variables 16380 1727204167.02892: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:07 -0400 (0:00:00.067) 0:00:28.135 ***** 16380 1727204167.02939: entering _queue_task() for managed-node2/set_fact 16380 1727204167.03344: worker is 1 (out of 1 available) 16380 1727204167.03362: exiting _queue_task() for managed-node2/set_fact 16380 1727204167.03376: done queuing things up, now waiting for results queue to drain 16380 1727204167.03378: waiting for pending results... 
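
Both ostree tasks here, the 'Check if system is ostree' stat just skipped and the 'Set flag to indicate system is ostree' set_fact now being queued, share the guard 'not __network_is_ostree is defined': once the flag exists from an earlier pass, neither task runs again for the rest of the play. A sketch of that run-once pattern; the probed path '/run/ostree-booted' and the cached value are assumptions, since the skipped task's body never appears in this log:

    import os

    # Flag cached earlier in this run (source: set_fact, per the log);
    # the value here is illustrative only.
    facts = {"__network_is_ostree": False}

    def is_ostree(facts):
        # Guard seen in the log: `not __network_is_ostree is defined`
        # evaluated False, so both the stat and the set_fact were skipped.
        if "__network_is_ostree" in facts:
            return facts["__network_is_ostree"]
        # Roughly what the guarded tasks would do on first evaluation;
        # the probed path is an assumption, not shown in this log.
        facts["__network_is_ostree"] = os.path.exists("/run/ostree-booted")
        return facts["__network_is_ostree"]

    print(is_ostree(facts))  # cached answer; no remote stat is needed
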
16380 1727204167.04316: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16380 1727204167.04392: in run() - task 12b410aa-8751-749c-b6eb-0000000002f9 16380 1727204167.04477: variable 'ansible_search_path' from source: unknown 16380 1727204167.04493: variable 'ansible_search_path' from source: unknown 16380 1727204167.04553: calling self._execute() 16380 1727204167.04695: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204167.04713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204167.04733: variable 'omit' from source: magic vars 16380 1727204167.05286: variable 'ansible_distribution_major_version' from source: facts 16380 1727204167.05296: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204167.05627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204167.05960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204167.06029: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204167.06083: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204167.06137: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204167.06280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204167.06307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204167.06389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204167.06397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204167.06528: variable '__network_is_ostree' from source: set_fact 16380 1727204167.06544: Evaluated conditional (not __network_is_ostree is defined): False 16380 1727204167.06553: when evaluation is False, skipping this task 16380 1727204167.06563: _execute() done 16380 1727204167.06573: dumping result to json 16380 1727204167.06597: done dumping result, returning 16380 1727204167.06615: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-749c-b6eb-0000000002f9] 16380 1727204167.06718: sending task result for task 12b410aa-8751-749c-b6eb-0000000002f9 16380 1727204167.06812: done sending task result for task 12b410aa-8751-749c-b6eb-0000000002f9 16380 1727204167.06818: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16380 1727204167.06887: no more pending results, returning what we have 16380 1727204167.06893: results queue empty 16380 1727204167.06894: checking for any_errors_fatal 16380 1727204167.06901: done checking for any_errors_fatal 16380 
1727204167.06902: checking for max_fail_percentage 16380 1727204167.06904: done checking for max_fail_percentage 16380 1727204167.06905: checking to see if all hosts have failed and the running result is not ok 16380 1727204167.06907: done checking to see if all hosts have failed 16380 1727204167.06908: getting the remaining hosts for this loop 16380 1727204167.06912: done getting the remaining hosts for this loop 16380 1727204167.06918: getting the next task for host managed-node2 16380 1727204167.06931: done getting next task for host managed-node2 16380 1727204167.06935: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 16380 1727204167.06939: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204167.06964: getting variables 16380 1727204167.06966: in VariableManager get_vars() 16380 1727204167.07101: Calling all_inventory to load vars for managed-node2 16380 1727204167.07106: Calling groups_inventory to load vars for managed-node2 16380 1727204167.07111: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204167.07126: Calling all_plugins_play to load vars for managed-node2 16380 1727204167.07131: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204167.07135: Calling groups_plugins_play to load vars for managed-node2 16380 1727204167.10116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204167.12456: done with get_vars() 16380 1727204167.12493: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:07 -0400 (0:00:00.096) 0:00:28.232 ***** 16380 1727204167.12579: entering _queue_task() for managed-node2/service_facts 16380 1727204167.12864: worker is 1 (out of 1 available) 16380 1727204167.12882: exiting _queue_task() for managed-node2/service_facts 16380 1727204167.12897: done queuing things up, now waiting for results queue to drain 16380 1727204167.12899: waiting for pending results... 
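
The service_facts task queued here is what produces the large {"ansible_facts": {"services": {...}}} payload later in this section; the role inspects that map to decide whether NetworkManager is usable as its provider. A minimal sketch of reading that payload the way the controller does once _low_level_execute_command() returns it on stdout; the JSON fragment is trimmed from the module output captured below, and the lookup is an illustrative reduction, not the role's actual code:

    import json

    # Trimmed from the module output captured later in this log.
    stdout = ('{"ansible_facts": {"services": {"NetworkManager.service": '
              '{"name": "NetworkManager.service", "state": "running", '
              '"status": "enabled", "source": "systemd"}}}}')

    services = json.loads(stdout)["ansible_facts"]["services"]
    nm = services.get("NetworkManager.service", {})
    # Matches the facts below: NetworkManager is running and enabled.
    print(nm.get("state") == "running" and nm.get("status") == "enabled")  # True
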
16380 1727204167.13101: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 16380 1727204167.13216: in run() - task 12b410aa-8751-749c-b6eb-0000000002fb 16380 1727204167.13235: variable 'ansible_search_path' from source: unknown 16380 1727204167.13240: variable 'ansible_search_path' from source: unknown 16380 1727204167.13271: calling self._execute() 16380 1727204167.13357: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204167.13361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204167.13374: variable 'omit' from source: magic vars 16380 1727204167.13702: variable 'ansible_distribution_major_version' from source: facts 16380 1727204167.13776: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204167.13780: variable 'omit' from source: magic vars 16380 1727204167.13785: variable 'omit' from source: magic vars 16380 1727204167.13803: variable 'omit' from source: magic vars 16380 1727204167.13842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204167.13874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204167.13895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204167.13917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204167.13928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204167.13979: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204167.13982: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204167.13985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204167.14072: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204167.14078: Set connection var ansible_shell_executable to /bin/sh 16380 1727204167.14085: Set connection var ansible_connection to ssh 16380 1727204167.14093: Set connection var ansible_shell_type to sh 16380 1727204167.14100: Set connection var ansible_pipelining to False 16380 1727204167.14116: Set connection var ansible_timeout to 10 16380 1727204167.14136: variable 'ansible_shell_executable' from source: unknown 16380 1727204167.14139: variable 'ansible_connection' from source: unknown 16380 1727204167.14143: variable 'ansible_module_compression' from source: unknown 16380 1727204167.14145: variable 'ansible_shell_type' from source: unknown 16380 1727204167.14150: variable 'ansible_shell_executable' from source: unknown 16380 1727204167.14154: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204167.14159: variable 'ansible_pipelining' from source: unknown 16380 1727204167.14162: variable 'ansible_timeout' from source: unknown 16380 1727204167.14168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204167.14343: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204167.14355: variable 'omit' from source: magic vars 16380 
1727204167.14360: starting attempt loop 16380 1727204167.14363: running the handler 16380 1727204167.14378: _low_level_execute_command(): starting 16380 1727204167.14385: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204167.14942: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204167.14946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204167.14949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204167.14952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204167.15000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204167.15026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204167.15064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204167.16867: stdout chunk (state=3): >>>/root <<< 16380 1727204167.16984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204167.17057: stderr chunk (state=3): >>><<< 16380 1727204167.17060: stdout chunk (state=3): >>><<< 16380 1727204167.17075: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204167.17117: _low_level_execute_command(): starting 16380 1727204167.17121: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424 `" && echo ansible-tmp-1727204167.170851-18357-202922505234424="` 
echo /root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424 `" ) && sleep 0' 16380 1727204167.17858: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204167.17873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204167.17894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204167.17956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204167.20059: stdout chunk (state=3): >>>ansible-tmp-1727204167.170851-18357-202922505234424=/root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424 <<< 16380 1727204167.20176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204167.20290: stderr chunk (state=3): >>><<< 16380 1727204167.20294: stdout chunk (state=3): >>><<< 16380 1727204167.20314: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204167.170851-18357-202922505234424=/root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204167.20377: variable 'ansible_module_compression' from source: unknown 16380 1727204167.20428: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 16380 1727204167.20495: variable 'ansible_facts' from source: unknown 16380 1727204167.20567: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/AnsiballZ_service_facts.py 16380 1727204167.20746: Sending initial data 16380 1727204167.20750: Sent initial data (161 bytes) 16380 1727204167.21286: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204167.21291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204167.21294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204167.21297: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204167.21302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204167.21344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204167.21360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204167.21405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204167.23130: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204167.23175: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204167.23210: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpkinq5boa /root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/AnsiballZ_service_facts.py <<< 16380 1727204167.23217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/AnsiballZ_service_facts.py" <<< 16380 1727204167.23246: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpkinq5boa" to remote "/root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/AnsiballZ_service_facts.py" <<< 16380 1727204167.24170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204167.24276: stderr chunk (state=3): >>><<< 16380 1727204167.24279: stdout chunk (state=3): >>><<< 16380 1727204167.24315: done transferring module to remote 16380 1727204167.24326: _low_level_execute_command(): starting 16380 1727204167.24331: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/ /root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/AnsiballZ_service_facts.py && sleep 0' 16380 1727204167.24981: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204167.24986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204167.24988: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204167.24993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204167.25100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204167.25131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204167.27044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204167.27115: stderr chunk (state=3): >>><<< 16380 1727204167.27128: stdout chunk (state=3): >>><<< 16380 1727204167.27154: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204167.27158: _low_level_execute_command(): starting 16380 1727204167.27235: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/AnsiballZ_service_facts.py && sleep 0' 16380 1727204167.27855: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204167.27859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204167.27862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204167.27886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204167.27976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204167.27982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204167.27985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204167.28047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204170.34382: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 16380 1727204170.34438: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", 
"source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inact<<< 16380 1727204170.34472: stdout chunk (state=3): >>>ive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": 
"system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 16380 1727204170.36197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204170.36219: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 16380 1727204170.36238: stderr chunk (state=3): >>><<< 16380 1727204170.36247: stdout chunk (state=3): >>><<< 16380 1727204170.36282: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": 
"sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": 
"systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204170.37815: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204170.37839: _low_level_execute_command(): starting 16380 1727204170.37895: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204167.170851-18357-202922505234424/ > /dev/null 2>&1 && sleep 0' 16380 1727204170.38579: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204170.38607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204170.38624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204170.38655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204170.38676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204170.38692: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204170.38773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204170.38815: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204170.38833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204170.38856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204170.38938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204170.41009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204170.41021: stdout chunk (state=3): >>><<< 16380 1727204170.41033: stderr chunk (state=3): >>><<< 16380 1727204170.41055: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204170.41069: handler run complete 16380 1727204170.41369: variable 'ansible_facts' from source: unknown 16380 1727204170.41795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204170.42388: variable 'ansible_facts' from source: unknown 16380 1727204170.42607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204170.42958: attempt loop complete, returning result 16380 1727204170.42972: _execute() done 16380 1727204170.42981: dumping result to json 16380 1727204170.43066: done dumping result, returning 16380 1727204170.43082: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-749c-b6eb-0000000002fb] 16380 1727204170.43097: sending task result for task 12b410aa-8751-749c-b6eb-0000000002fb 16380 1727204170.44694: done sending task result for task 12b410aa-8751-749c-b6eb-0000000002fb 16380 1727204170.44699: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204170.44832: no more pending results, returning what we have 16380 1727204170.44835: results queue empty 16380 1727204170.44836: checking for any_errors_fatal 16380 1727204170.44840: done checking for any_errors_fatal 16380 1727204170.44841: checking for max_fail_percentage 16380 1727204170.44843: done checking for max_fail_percentage 16380 1727204170.44844: checking to see if all hosts have failed and the running result is not ok 16380 1727204170.44845: done checking to see if all hosts have failed 16380 1727204170.44846: 
getting the remaining hosts for this loop 16380 1727204170.44847: done getting the remaining hosts for this loop 16380 1727204170.44851: getting the next task for host managed-node2 16380 1727204170.44857: done getting next task for host managed-node2 16380 1727204170.44861: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 16380 1727204170.44864: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204170.44876: getting variables 16380 1727204170.44878: in VariableManager get_vars() 16380 1727204170.44921: Calling all_inventory to load vars for managed-node2 16380 1727204170.44928: Calling groups_inventory to load vars for managed-node2 16380 1727204170.44932: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204170.44942: Calling all_plugins_play to load vars for managed-node2 16380 1727204170.44953: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204170.44957: Calling groups_plugins_play to load vars for managed-node2 16380 1727204170.47325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204170.50263: done with get_vars() 16380 1727204170.50302: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:10 -0400 (0:00:03.378) 0:00:31.610 ***** 16380 1727204170.50418: entering _queue_task() for managed-node2/package_facts 16380 1727204170.50772: worker is 1 (out of 1 available) 16380 1727204170.50787: exiting _queue_task() for managed-node2/package_facts 16380 1727204170.51003: done queuing things up, now waiting for results queue to drain 16380 1727204170.51006: waiting for pending results... 
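For orientation between these two traced tasks: the service_facts result shown above exposes each systemd unit under ansible_facts.services with name/state/status/source fields, and the package_facts task being queued here will expose installed packages under ansible_facts.packages. The playbook snippet below is a minimal sketch of how such facts are typically consumed downstream, under that assumption; the task names and the NetworkManager lookup keys are illustrative only and are not taken from this run:

  - name: Example - act on a unit from ansible_facts.services
    ansible.builtin.debug:
      msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state }}"
    when: "'NetworkManager.service' in ansible_facts.services"

  - name: Example - gather package facts
    ansible.builtin.package_facts:
      manager: auto

  - name: Example - act on a package from ansible_facts.packages
    ansible.builtin.debug:
      msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }} is installed"
    when: "'NetworkManager' in ansible_facts.packages"

Bracketed key access is needed because the unit names contain dots, and each ansible_facts.packages value is a list (one entry per installed version/arch), hence the [0] index in the sketch.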
16380 1727204170.51136: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed
16380 1727204170.51266: in run() - task 12b410aa-8751-749c-b6eb-0000000002fc
16380 1727204170.51288: variable 'ansible_search_path' from source: unknown
16380 1727204170.51299: variable 'ansible_search_path' from source: unknown
16380 1727204170.51395: calling self._execute()
16380 1727204170.51454: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204170.51468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204170.51484: variable 'omit' from source: magic vars
16380 1727204170.51932: variable 'ansible_distribution_major_version' from source: facts
16380 1727204170.51951: Evaluated conditional (ansible_distribution_major_version != '6'): True
16380 1727204170.51962: variable 'omit' from source: magic vars
16380 1727204170.52038: variable 'omit' from source: magic vars
16380 1727204170.52080: variable 'omit' from source: magic vars
16380 1727204170.52136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
16380 1727204170.52216: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
16380 1727204170.52222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
16380 1727204170.52250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204170.52268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
16380 1727204170.52324: variable 'inventory_hostname' from source: host vars for 'managed-node2'
16380 1727204170.52328: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204170.52433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204170.52474: Set connection var ansible_module_compression to ZIP_DEFLATED
16380 1727204170.52492: Set connection var ansible_shell_executable to /bin/sh
16380 1727204170.52507: Set connection var ansible_connection to ssh
16380 1727204170.52520: Set connection var ansible_shell_type to sh
16380 1727204170.52532: Set connection var ansible_pipelining to False
16380 1727204170.52554: Set connection var ansible_timeout to 10
16380 1727204170.52587: variable 'ansible_shell_executable' from source: unknown
16380 1727204170.52600: variable 'ansible_connection' from source: unknown
16380 1727204170.52609: variable 'ansible_module_compression' from source: unknown
16380 1727204170.52617: variable 'ansible_shell_type' from source: unknown
16380 1727204170.52626: variable 'ansible_shell_executable' from source: unknown
16380 1727204170.52634: variable 'ansible_host' from source: host vars for 'managed-node2'
16380 1727204170.52646: variable 'ansible_pipelining' from source: unknown
16380 1727204170.52658: variable 'ansible_timeout' from source: unknown
16380 1727204170.52668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
16380 1727204170.52917: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
16380 1727204170.52937: variable 'omit' from source: magic vars
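The "Set connection var" records show the executor resolving each connection setting in precedence order, falling back to built-in defaults when no host or task variable overrides it (the "from source: unknown" lines). A simplified sketch of that lookup, using the default values visible above; the function and dict names are illustrative, not Ansible's internals:

    DEFAULTS = {
        "ansible_connection": "ssh",
        "ansible_shell_type": "sh",
        "ansible_shell_executable": "/bin/sh",
        "ansible_pipelining": False,
        "ansible_timeout": 10,
        "ansible_module_compression": "ZIP_DEFLATED",
    }

    def resolve_connection_var(name: str, task_vars: dict, host_vars: dict):
        # More specific sources win; otherwise fall back to the default,
        # which the log reports as "from source: unknown".
        for source in (task_vars, host_vars):
            if name in source:
                return source[name]
        return DEFAULTS[name]

    host_vars = {"ansible_host": "10.31.9.159"}
    print(resolve_connection_var("ansible_timeout", {}, host_vars))  # -> 10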
16380 1727204170.52948: starting attempt loop
16380 1727204170.52956: running the handler
16380 1727204170.53138: _low_level_execute_command(): starting
16380 1727204170.53143: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
16380 1727204170.53855: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
16380 1727204170.53874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
16380 1727204170.53894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
16380 1727204170.53918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
16380 1727204170.53940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
16380 1727204170.54049: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<<
16380 1727204170.54066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204170.54083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204170.54280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204170.56085: stdout chunk (state=3): >>>/root <<<
16380 1727204170.56195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204170.56268: stderr chunk (state=3): >>><<<
16380 1727204170.56279: stdout chunk (state=3): >>><<<
16380 1727204170.56312: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
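The first remote command of the module run is a cheap probe: /bin/sh -c 'echo ~ && sleep 0' discovers the remote user's home directory (stdout=/root above). It rides the persistent SSH ControlMaster socket, which is why stderr shows "auto-mux: Trying existing master" instead of a fresh key exchange. A self-contained sketch of the same round trip; the ControlPath value is an assumption for illustration, though the options themselves (ControlMaster, ControlPersist, ControlPath) are standard OpenSSH:

    import subprocess

    def low_level_exec(host: str, cmd: str) -> tuple[int, str, str]:
        # Wrap the command in /bin/sh -c and run it over a multiplexed
        # SSH connection, mirroring _low_level_execute_command() above.
        ssh = [
            "ssh", "-vv",
            "-o", "ControlMaster=auto",
            "-o", "ControlPersist=60s",
            "-o", "ControlPath=/root/.ansible/cp/%h-%p-%r",  # assumed path
            host, "/bin/sh -c '{}'".format(cmd),
        ]
        proc = subprocess.run(ssh, capture_output=True, text=True)
        return proc.returncode, proc.stdout, proc.stderr

    rc, out, err = low_level_exec("10.31.9.159", "echo ~ && sleep 0")
    print(rc, out.strip())   # expected: 0 /root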
16380 1727204170.56334: _low_level_execute_command(): starting
16380 1727204170.56428: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380 `" && echo ansible-tmp-1727204170.5631974-18563-47476276752380="` echo /root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380 `" ) && sleep 0'
16380 1727204170.56974: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
16380 1727204170.57002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
16380 1727204170.57036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
16380 1727204170.57153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204170.57381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204170.57608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204170.59526: stdout chunk (state=3): >>>ansible-tmp-1727204170.5631974-18563-47476276752380=/root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380 <<<
16380 1727204170.59649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204170.59717: stderr chunk (state=3): >>><<<
16380 1727204170.59728: stdout chunk (state=3): >>><<<
16380 1727204170.59751: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204170.5631974-18563-47476276752380=/root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204170.59814: variable 'ansible_module_compression' from source: unknown
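The mkdir command above is the standard remote-tempdir idiom: umask 77 makes the new directory mode 0700, mkdir -p creates the base directory, a plain mkdir on the unique name fails if it already exists, and the trailing echo name="path" prints the resolved path back so the controller can parse it out of stdout (the ansible-tmp-...=... chunk above). A sketch of how such a command string can be assembled; the function name is illustrative:

    import random
    import time

    def build_tmpdir_command(basedir: str = "/root/.ansible/tmp") -> str:
        name = "ansible-tmp-%s-%s" % (time.time(), random.randint(0, 2**48))
        path = "%s/%s" % (basedir, name)
        # The backquoted ` echo ... ` lets the remote shell expand the path;
        # the final echo reports name=path on stdout for the controller.
        return (
            f'( umask 77 && mkdir -p "` echo {basedir} `" '
            f'&& mkdir "` echo {path} `" '
            f'&& echo {name}="` echo {path} `" ) && sleep 0'
        )

    print(build_tmpdir_command())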
16380 1727204170.59869: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED
16380 1727204170.59944: variable 'ansible_facts' from source: unknown
16380 1727204170.60161: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/AnsiballZ_package_facts.py
16380 1727204170.60427: Sending initial data
16380 1727204170.60430: Sent initial data (161 bytes)
16380 1727204170.61008: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204170.61058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
16380 1727204170.61082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204170.61102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204170.61219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204170.62960: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
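The module itself is an AnsiballZ payload, a self-extracting Python script wrapping a zip of the module and its dependencies. It is built once per module and compression setting and reused from the local ansiballz_cache ("using cached module" above), then uploaded in a single sftp batch "put" over the same control socket. A sketch of that transfer step; the ControlPath is again an assumed value:

    import subprocess

    def sftp_put(host: str, local: str, remote: str) -> int:
        # "-b -" makes sftp read batch commands from stdin; the single
        # "put" matches the "sftp> put ..." stdout chunk below.
        batch = "put {} {}\n".format(local, remote)
        proc = subprocess.run(
            ["sftp", "-o", "ControlPath=/root/.ansible/cp/%h-%p-%r",
             "-b", "-", host],
            input=batch, capture_output=True, text=True,
        )
        return proc.returncode

    rc = sftp_put(
        "10.31.9.159",
        "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp8rqxv0iw",
        "/root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/AnsiballZ_package_facts.py",
    )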
16380 1727204170.63002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
16380 1727204170.63065: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp8rqxv0iw /root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/AnsiballZ_package_facts.py <<<
16380 1727204170.63068: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/AnsiballZ_package_facts.py" <<<
16380 1727204170.63133: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp8rqxv0iw" to remote "/root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/AnsiballZ_package_facts.py" <<<
16380 1727204170.68001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204170.68456: stderr chunk (state=3): >>><<<
16380 1727204170.68460: stdout chunk (state=3): >>><<<
16380 1727204170.68462: done transferring module to remote
16380 1727204170.68465: _low_level_execute_command(): starting
16380 1727204170.68467: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/ /root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/AnsiballZ_package_facts.py && sleep 0'
16380 1727204170.69642: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
16380 1727204170.69660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
16380 1727204170.69664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204170.69729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
16380 1727204170.69745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
16380 1727204170.69770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204170.69929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204170.71884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
16380 1727204170.71955: stderr chunk (state=3): >>><<<
16380 1727204170.72126: stdout chunk (state=3): >>><<<
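Two more round trips finish the job: chmod u+x marks both the tempdir and the AnsiballZ file (the directory needs the execute bit for traversal), and then the remote interpreter runs the module, whose entire JSON result arrives on stdout. A sketch of these steps plus parsing the result, reusing the multiplexed-SSH pattern from the earlier sketch (ControlPath again assumed):

    import json
    import subprocess

    def run_remote(host: str, cmd: str) -> subprocess.CompletedProcess:
        return subprocess.run(
            ["ssh", "-o", "ControlMaster=auto", "-o", "ControlPersist=60s",
             "-o", "ControlPath=/root/.ansible/cp/%h-%p-%r",
             host, "/bin/sh -c '{}'".format(cmd)],
            capture_output=True, text=True)

    tmp = "/root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380"
    module = tmp + "/AnsiballZ_package_facts.py"
    run_remote("10.31.9.159", "chmod u+x {}/ {} && sleep 0".format(tmp, module))
    result = run_remote("10.31.9.159",
                        "/usr/bin/python3.12 {} && sleep 0".format(module))
    facts = json.loads(result.stdout)
    packages = facts["ansible_facts"]["packages"]
    # e.g. packages["bash"][0]["version"] -> "5.2.26" per the payload below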
16380 1727204170.72131: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
16380 1727204170.72140: _low_level_execute_command(): starting
16380 1727204170.72144: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/AnsiballZ_package_facts.py && sleep 0'
16380 1727204170.73395: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
16380 1727204170.73526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
16380 1727204170.73676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
16380 1727204170.73680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
16380 1727204171.38409: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39",
"release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 16380 1727204171.38536: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": 
"realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", 
"release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", 
"release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": 
"3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 16380 1727204171.38663: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": 
"0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": 
[{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", 
"release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": 
[{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 16380 1727204171.40677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204171.40866: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204171.40891: stderr chunk (state=3): >>><<< 16380 1727204171.40933: stdout chunk (state=3): >>><<< 16380 1727204171.40974: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204171.51498: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204171.51503: _low_level_execute_command(): starting 16380 1727204171.51506: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204170.5631974-18563-47476276752380/ > /dev/null 2>&1 && sleep 0' 16380 1727204171.52140: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204171.52185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204171.52191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204171.52194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204171.52196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204171.52199: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204171.52219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204171.52229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16380 1727204171.52257: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204171.52261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204171.52325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204171.52333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204171.52436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204171.54421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204171.54484: stderr chunk (state=3): >>><<< 16380 1727204171.54488: stdout chunk (state=3): >>><<< 16380 1727204171.54503: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204171.54512: handler run complete 16380 1727204171.55449: variable 'ansible_facts' from source: unknown 16380 1727204171.56167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204171.58475: variable 'ansible_facts' from source: unknown 16380 1727204171.58897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204171.59679: attempt loop complete, returning result 16380 1727204171.59696: _execute() done 16380 1727204171.59699: dumping result to json 16380 1727204171.59883: done dumping result, returning 16380 1727204171.59892: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-749c-b6eb-0000000002fc] 16380 1727204171.59898: sending task result for task 12b410aa-8751-749c-b6eb-0000000002fc ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204171.66961: done sending task result for task 12b410aa-8751-749c-b6eb-0000000002fc 16380 1727204171.66965: WORKER PROCESS EXITING 16380 1727204171.66971: no more pending results, returning what we have 16380 1727204171.66973: results queue empty 16380 1727204171.66973: checking for any_errors_fatal 16380 1727204171.66976: done checking for any_errors_fatal 16380 1727204171.66977: checking for max_fail_percentage 16380 1727204171.66978: done checking for max_fail_percentage 16380 1727204171.66978: checking to see if all hosts have failed and the running result is not ok 16380 1727204171.66979: done checking to see if all hosts have failed 16380 1727204171.66979: getting the remaining hosts for this loop 16380 1727204171.66980: done getting the remaining hosts for this loop 16380 1727204171.66983: getting the next task for host managed-node2 16380 1727204171.66986: done getting next task for host managed-node2 16380 1727204171.66990: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 16380 1727204171.66992: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 16380 1727204171.66998: getting variables 16380 1727204171.66999: in VariableManager get_vars() 16380 1727204171.67018: Calling all_inventory to load vars for managed-node2 16380 1727204171.67020: Calling groups_inventory to load vars for managed-node2 16380 1727204171.67022: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204171.67027: Calling all_plugins_play to load vars for managed-node2 16380 1727204171.67029: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204171.67031: Calling groups_plugins_play to load vars for managed-node2 16380 1727204171.69057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204171.71963: done with get_vars() 16380 1727204171.72000: done getting variables 16380 1727204171.72059: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:11 -0400 (0:00:01.216) 0:00:32.827 ***** 16380 1727204171.72092: entering _queue_task() for managed-node2/debug 16380 1727204171.72452: worker is 1 (out of 1 available) 16380 1727204171.72465: exiting _queue_task() for managed-node2/debug 16380 1727204171.72479: done queuing things up, now waiting for results queue to drain 16380 1727204171.72481: waiting for pending results... 16380 1727204171.72909: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 16380 1727204171.72915: in run() - task 12b410aa-8751-749c-b6eb-00000000003b 16380 1727204171.72935: variable 'ansible_search_path' from source: unknown 16380 1727204171.72943: variable 'ansible_search_path' from source: unknown 16380 1727204171.72986: calling self._execute() 16380 1727204171.73107: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204171.73122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204171.73143: variable 'omit' from source: magic vars 16380 1727204171.73611: variable 'ansible_distribution_major_version' from source: facts 16380 1727204171.73631: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204171.73643: variable 'omit' from source: magic vars 16380 1727204171.73701: variable 'omit' from source: magic vars 16380 1727204171.73827: variable 'network_provider' from source: set_fact 16380 1727204171.73897: variable 'omit' from source: magic vars 16380 1727204171.73909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204171.73957: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204171.73994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204171.74025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204171.74043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 16380 1727204171.74080: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204171.74115: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204171.74118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204171.74216: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204171.74236: Set connection var ansible_shell_executable to /bin/sh 16380 1727204171.74335: Set connection var ansible_connection to ssh 16380 1727204171.74339: Set connection var ansible_shell_type to sh 16380 1727204171.74342: Set connection var ansible_pipelining to False 16380 1727204171.74345: Set connection var ansible_timeout to 10 16380 1727204171.74348: variable 'ansible_shell_executable' from source: unknown 16380 1727204171.74351: variable 'ansible_connection' from source: unknown 16380 1727204171.74353: variable 'ansible_module_compression' from source: unknown 16380 1727204171.74355: variable 'ansible_shell_type' from source: unknown 16380 1727204171.74357: variable 'ansible_shell_executable' from source: unknown 16380 1727204171.74359: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204171.74362: variable 'ansible_pipelining' from source: unknown 16380 1727204171.74364: variable 'ansible_timeout' from source: unknown 16380 1727204171.74366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204171.74549: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204171.74665: variable 'omit' from source: magic vars 16380 1727204171.74669: starting attempt loop 16380 1727204171.74671: running the handler 16380 1727204171.74674: handler run complete 16380 1727204171.74677: attempt loop complete, returning result 16380 1727204171.74679: _execute() done 16380 1727204171.74684: dumping result to json 16380 1727204171.74696: done dumping result, returning 16380 1727204171.74710: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-749c-b6eb-00000000003b] 16380 1727204171.74721: sending task result for task 12b410aa-8751-749c-b6eb-00000000003b ok: [managed-node2] => {} MSG: Using network provider: nm 16380 1727204171.74901: no more pending results, returning what we have 16380 1727204171.74905: results queue empty 16380 1727204171.74906: checking for any_errors_fatal 16380 1727204171.74921: done checking for any_errors_fatal 16380 1727204171.74922: checking for max_fail_percentage 16380 1727204171.74924: done checking for max_fail_percentage 16380 1727204171.74926: checking to see if all hosts have failed and the running result is not ok 16380 1727204171.74927: done checking to see if all hosts have failed 16380 1727204171.74928: getting the remaining hosts for this loop 16380 1727204171.74930: done getting the remaining hosts for this loop 16380 1727204171.74936: getting the next task for host managed-node2 16380 1727204171.74945: done getting next task for host managed-node2 16380 1727204171.74950: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16380 1727204171.74952: ^ state is: 
HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204171.74966: getting variables 16380 1727204171.74968: in VariableManager get_vars() 16380 1727204171.75116: Calling all_inventory to load vars for managed-node2 16380 1727204171.75120: Calling groups_inventory to load vars for managed-node2 16380 1727204171.75123: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204171.75136: Calling all_plugins_play to load vars for managed-node2 16380 1727204171.75140: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204171.75144: Calling groups_plugins_play to load vars for managed-node2 16380 1727204171.75972: done sending task result for task 12b410aa-8751-749c-b6eb-00000000003b 16380 1727204171.75975: WORKER PROCESS EXITING 16380 1727204171.77601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204171.80494: done with get_vars() 16380 1727204171.80538: done getting variables 16380 1727204171.80614: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:11 -0400 (0:00:00.085) 0:00:32.913 ***** 16380 1727204171.80654: entering _queue_task() for managed-node2/fail 16380 1727204171.81032: worker is 1 (out of 1 available) 16380 1727204171.81044: exiting _queue_task() for managed-node2/fail 16380 1727204171.81059: done queuing things up, now waiting for results queue to drain 16380 1727204171.81062: waiting for pending results... 
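
The "Print network provider" task that just completed is a plain debug action: it is resolved entirely on the controller, which is why no SSH or module-copy records appear between "running the handler" and "handler run complete". A minimal sketch of what such a task could look like, assuming it simply prints the network_provider fact set earlier (the real task at roles/network/tasks/main.yml:7 may differ):

    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"

With network_provider resolved to "nm" from the earlier set_fact, this renders exactly the "Using network provider: nm" message shown in the result above.
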
16380 1727204171.81375: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16380 1727204171.81523: in run() - task 12b410aa-8751-749c-b6eb-00000000003c 16380 1727204171.81549: variable 'ansible_search_path' from source: unknown 16380 1727204171.81559: variable 'ansible_search_path' from source: unknown 16380 1727204171.81607: calling self._execute() 16380 1727204171.81719: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204171.81738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204171.81757: variable 'omit' from source: magic vars 16380 1727204171.82221: variable 'ansible_distribution_major_version' from source: facts 16380 1727204171.82241: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204171.82413: variable 'network_state' from source: role '' defaults 16380 1727204171.82431: Evaluated conditional (network_state != {}): False 16380 1727204171.82439: when evaluation is False, skipping this task 16380 1727204171.82448: _execute() done 16380 1727204171.82457: dumping result to json 16380 1727204171.82467: done dumping result, returning 16380 1727204171.82482: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-749c-b6eb-00000000003c] 16380 1727204171.82500: sending task result for task 12b410aa-8751-749c-b6eb-00000000003c skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204171.82745: no more pending results, returning what we have 16380 1727204171.82751: results queue empty 16380 1727204171.82753: checking for any_errors_fatal 16380 1727204171.82761: done checking for any_errors_fatal 16380 1727204171.82762: checking for max_fail_percentage 16380 1727204171.82764: done checking for max_fail_percentage 16380 1727204171.82766: checking to see if all hosts have failed and the running result is not ok 16380 1727204171.82767: done checking to see if all hosts have failed 16380 1727204171.82768: getting the remaining hosts for this loop 16380 1727204171.82770: done getting the remaining hosts for this loop 16380 1727204171.82774: getting the next task for host managed-node2 16380 1727204171.82783: done getting next task for host managed-node2 16380 1727204171.82788: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16380 1727204171.82794: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204171.82817: getting variables 16380 1727204171.82820: in VariableManager get_vars() 16380 1727204171.82865: Calling all_inventory to load vars for managed-node2 16380 1727204171.82869: Calling groups_inventory to load vars for managed-node2 16380 1727204171.82872: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204171.82888: Calling all_plugins_play to load vars for managed-node2 16380 1727204171.83097: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204171.83103: Calling groups_plugins_play to load vars for managed-node2 16380 1727204171.83806: done sending task result for task 12b410aa-8751-749c-b6eb-00000000003c 16380 1727204171.83809: WORKER PROCESS EXITING 16380 1727204171.85373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204171.88333: done with get_vars() 16380 1727204171.88374: done getting variables 16380 1727204171.88448: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:11 -0400 (0:00:00.078) 0:00:32.991 ***** 16380 1727204171.88484: entering _queue_task() for managed-node2/fail 16380 1727204171.88849: worker is 1 (out of 1 available) 16380 1727204171.88861: exiting _queue_task() for managed-node2/fail 16380 1727204171.88876: done queuing things up, now waiting for results queue to drain 16380 1727204171.88878: waiting for pending results... 
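
The skip above never touched the managed host: a fail task's when: guard is evaluated on the controller, and as soon as a condition evaluates false the executor short-circuits ("when evaluation is False, skipping this task") and reports the offending expression back as false_condition. Assuming the guard matches the logged condition, the task would be shaped roughly as below; the message text is invented here, and the real task at main.yml:11 very likely carries further conditions (an initscripts-provider check, at minimum) that were never reached because the first one already failed:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      fail:
        msg: Applying `network_state` is not supported with the initscripts provider
      when: network_state != {}
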
16380 1727204171.89183: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16380 1727204171.89323: in run() - task 12b410aa-8751-749c-b6eb-00000000003d 16380 1727204171.89346: variable 'ansible_search_path' from source: unknown 16380 1727204171.89356: variable 'ansible_search_path' from source: unknown 16380 1727204171.89405: calling self._execute() 16380 1727204171.89518: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204171.89538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204171.89555: variable 'omit' from source: magic vars 16380 1727204171.90019: variable 'ansible_distribution_major_version' from source: facts 16380 1727204171.90044: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204171.90209: variable 'network_state' from source: role '' defaults 16380 1727204171.90229: Evaluated conditional (network_state != {}): False 16380 1727204171.90240: when evaluation is False, skipping this task 16380 1727204171.90251: _execute() done 16380 1727204171.90260: dumping result to json 16380 1727204171.90269: done dumping result, returning 16380 1727204171.90281: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-749c-b6eb-00000000003d] 16380 1727204171.90294: sending task result for task 12b410aa-8751-749c-b6eb-00000000003d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204171.90608: no more pending results, returning what we have 16380 1727204171.90612: results queue empty 16380 1727204171.90614: checking for any_errors_fatal 16380 1727204171.90626: done checking for any_errors_fatal 16380 1727204171.90627: checking for max_fail_percentage 16380 1727204171.90629: done checking for max_fail_percentage 16380 1727204171.90630: checking to see if all hosts have failed and the running result is not ok 16380 1727204171.90631: done checking to see if all hosts have failed 16380 1727204171.90633: getting the remaining hosts for this loop 16380 1727204171.90635: done getting the remaining hosts for this loop 16380 1727204171.90640: getting the next task for host managed-node2 16380 1727204171.90647: done getting next task for host managed-node2 16380 1727204171.90652: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16380 1727204171.90655: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204171.90675: getting variables 16380 1727204171.90677: in VariableManager get_vars() 16380 1727204171.90726: Calling all_inventory to load vars for managed-node2 16380 1727204171.90730: Calling groups_inventory to load vars for managed-node2 16380 1727204171.90733: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204171.90750: Calling all_plugins_play to load vars for managed-node2 16380 1727204171.90754: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204171.90758: Calling groups_plugins_play to load vars for managed-node2 16380 1727204171.91307: done sending task result for task 12b410aa-8751-749c-b6eb-00000000003d 16380 1727204171.91310: WORKER PROCESS EXITING 16380 1727204171.93093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204171.95974: done with get_vars() 16380 1727204171.96019: done getting variables 16380 1727204171.96098: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:11 -0400 (0:00:00.076) 0:00:33.067 ***** 16380 1727204171.96135: entering _queue_task() for managed-node2/fail 16380 1727204171.96509: worker is 1 (out of 1 available) 16380 1727204171.96524: exiting _queue_task() for managed-node2/fail 16380 1727204171.96539: done queuing things up, now waiting for results queue to drain 16380 1727204171.96541: waiting for pending results... 
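
The teaming-abort task queued above is worth watching in the records that follow: two conditionals are evaluated in order, ansible_distribution_major_version | int > 9 (True) and only then ansible_distribution in __network_rh_distros (False). That top-to-bottom, short-circuiting AND is exactly how a list-form when: behaves, so a guard with the observed evaluation order might look like the following (the fail message is an assumption):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      fail:
        msg: Teaming is no longer supported on EL10 or later
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
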
16380 1727204171.96847: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16380 1727204171.96981: in run() - task 12b410aa-8751-749c-b6eb-00000000003e 16380 1727204171.97006: variable 'ansible_search_path' from source: unknown 16380 1727204171.97019: variable 'ansible_search_path' from source: unknown 16380 1727204171.97064: calling self._execute() 16380 1727204171.97181: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204171.97197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204171.97213: variable 'omit' from source: magic vars 16380 1727204171.97675: variable 'ansible_distribution_major_version' from source: facts 16380 1727204171.97696: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204171.97933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204172.00780: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204172.00875: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204172.00929: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204172.00981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204172.01023: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204172.01130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.01178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.01220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.01372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.01376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.01426: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.01451: Evaluated conditional (ansible_distribution_major_version | int > 9): True 16380 1727204172.01619: variable 'ansible_distribution' from source: facts 16380 1727204172.01631: variable '__network_rh_distros' from source: role '' defaults 16380 1727204172.01646: Evaluated conditional (ansible_distribution in __network_rh_distros): False 16380 1727204172.01654: when evaluation is False, skipping this task 16380 1727204172.01662: _execute() done 16380 1727204172.01669: dumping result to json 16380 1727204172.01679: done dumping result, returning 16380 1727204172.01697: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-749c-b6eb-00000000003e] 16380 1727204172.01708: sending task result for task 12b410aa-8751-749c-b6eb-00000000003e skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 16380 1727204172.01972: no more pending results, returning what we have 16380 1727204172.01976: results queue empty 16380 1727204172.01977: checking for any_errors_fatal 16380 1727204172.01985: done checking for any_errors_fatal 16380 1727204172.01987: checking for max_fail_percentage 16380 1727204172.01991: done checking for max_fail_percentage 16380 1727204172.01992: checking to see if all hosts have failed and the running result is not ok 16380 1727204172.01993: done checking to see if all hosts have failed 16380 1727204172.01994: getting the remaining hosts for this loop 16380 1727204172.01996: done getting the remaining hosts for this loop 16380 1727204172.02001: getting the next task for host managed-node2 16380 1727204172.02009: done getting next task for host managed-node2 16380 1727204172.02014: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16380 1727204172.02017: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204172.02034: getting variables 16380 1727204172.02036: in VariableManager get_vars() 16380 1727204172.02081: Calling all_inventory to load vars for managed-node2 16380 1727204172.02085: Calling groups_inventory to load vars for managed-node2 16380 1727204172.02088: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204172.02304: Calling all_plugins_play to load vars for managed-node2 16380 1727204172.02308: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204172.02312: Calling groups_plugins_play to load vars for managed-node2 16380 1727204172.03100: done sending task result for task 12b410aa-8751-749c-b6eb-00000000003e 16380 1727204172.03103: WORKER PROCESS EXITING 16380 1727204172.04642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204172.07646: done with get_vars() 16380 1727204172.07687: done getting variables 16380 1727204172.07760: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.116) 0:00:33.184 ***** 16380 1727204172.07798: entering _queue_task() for managed-node2/dnf 16380 1727204172.08178: worker is 1 (out of 1 available) 16380 1727204172.08395: exiting _queue_task() 
for managed-node2/dnf 16380 1727204172.08408: done queuing things up, now waiting for results queue to drain 16380 1727204172.08410: waiting for pending results... 16380 1727204172.08516: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16380 1727204172.08653: in run() - task 12b410aa-8751-749c-b6eb-00000000003f 16380 1727204172.08677: variable 'ansible_search_path' from source: unknown 16380 1727204172.08685: variable 'ansible_search_path' from source: unknown 16380 1727204172.08731: calling self._execute() 16380 1727204172.08848: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.08864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.08880: variable 'omit' from source: magic vars 16380 1727204172.09342: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.09360: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204172.09601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204172.12294: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204172.12386: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204172.12435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204172.12485: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204172.12522: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204172.12635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.12678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.12715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.12777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.12800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.12956: variable 'ansible_distribution' from source: facts 16380 1727204172.12967: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.12993: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 16380 1727204172.13133: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204172.13494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.13498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.13501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.13503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.13506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.13520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.13556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.13595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.13655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.13677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.13738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.13772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.13810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.13869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.13896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.14117: variable 'network_connections' from source: play vars 16380 1727204172.14137: variable 'profile' from source: play vars 16380 1727204172.14221: variable 'profile' from source: play vars 16380 1727204172.14233: variable 'interface' from source: set_fact 16380 
1727204172.14317: variable 'interface' from source: set_fact 16380 1727204172.14422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204172.14642: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204172.14697: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204172.14766: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204172.14924: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204172.14927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204172.14930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204172.14943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.14980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204172.15046: variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204172.15575: variable 'network_connections' from source: play vars 16380 1727204172.15588: variable 'profile' from source: play vars 16380 1727204172.15667: variable 'profile' from source: play vars 16380 1727204172.15690: variable 'interface' from source: set_fact 16380 1727204172.15758: variable 'interface' from source: set_fact 16380 1727204172.15907: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16380 1727204172.15911: when evaluation is False, skipping this task 16380 1727204172.15913: _execute() done 16380 1727204172.15916: dumping result to json 16380 1727204172.15918: done dumping result, returning 16380 1727204172.15921: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-00000000003f] 16380 1727204172.15923: sending task result for task 12b410aa-8751-749c-b6eb-00000000003f 16380 1727204172.16001: done sending task result for task 12b410aa-8751-749c-b6eb-00000000003f 16380 1727204172.16004: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16380 1727204172.16067: no more pending results, returning what we have 16380 1727204172.16071: results queue empty 16380 1727204172.16072: checking for any_errors_fatal 16380 1727204172.16080: done checking for any_errors_fatal 16380 1727204172.16082: checking for max_fail_percentage 16380 1727204172.16084: done checking for max_fail_percentage 16380 1727204172.16085: checking to see if all hosts have failed and the running result is not ok 16380 1727204172.16086: done checking to see if all hosts have failed 
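
Notice what happened just before that conditional was settled: the executor resolved network_connections, profile, and interface from play vars, loaded the Jinja2 filter and test plugins, and only then evaluated __network_wireless_connections_defined or __network_team_connections_defined. That is the signature of those two flags being lazy Jinja2 expressions in the role defaults (the log shows them coming "from source: role '' defaults") rather than precomputed booleans: they are re-rendered against network_connections every time a guard references them. A hypothetical defaults/main.yml shape with that behaviour, the exact expressions in the role may well differ:

    __network_wireless_connections_defined: "{{
      network_connections | selectattr('type', 'defined')
                          | selectattr('type', 'eq', 'wireless')
                          | list | length > 0 }}"
    __network_team_connections_defined: "{{
      network_connections | selectattr('type', 'defined')
                          | selectattr('type', 'eq', 'team')
                          | list | length > 0 }}"

Since no wireless or team connection is defined in network_connections here, both flags render False and the DNF update check is skipped.
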
16380 1727204172.16087: getting the remaining hosts for this loop 16380 1727204172.16091: done getting the remaining hosts for this loop 16380 1727204172.16095: getting the next task for host managed-node2 16380 1727204172.16103: done getting next task for host managed-node2 16380 1727204172.16109: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16380 1727204172.16111: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204172.16128: getting variables 16380 1727204172.16130: in VariableManager get_vars() 16380 1727204172.16177: Calling all_inventory to load vars for managed-node2 16380 1727204172.16181: Calling groups_inventory to load vars for managed-node2 16380 1727204172.16184: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204172.16500: Calling all_plugins_play to load vars for managed-node2 16380 1727204172.16504: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204172.16509: Calling groups_plugins_play to load vars for managed-node2 16380 1727204172.18823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204172.21804: done with get_vars() 16380 1727204172.21849: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16380 1727204172.21945: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.141) 0:00:33.326 ***** 16380 1727204172.21983: entering _queue_task() for managed-node2/yum 16380 1727204172.26837: worker is 1 (out of 1 available) 16380 1727204172.26854: exiting _queue_task() for managed-node2/yum 16380 1727204172.26868: done queuing things up, now waiting for results queue to drain 16380 1727204172.26870: waiting for pending results... 
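
The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" record directly above is routing, not a failure: on dnf-based platforms the yum action is an alias that resolves to the dnf implementation, which is why the loader then reports action/dnf.py for a task named after YUM. A task that would produce this redirect, sketched on the assumption that it mirrors the DNF variant but is reserved for EL7-and-older hosts (the version guard is the condition the trace goes on to report as false_condition):

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      yum:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when: ansible_distribution_major_version | int < 8

check_mode: true would keep such a task read-only, reporting whether updates exist without installing anything; whether the real role task uses check mode this way is an assumption, and only the module and the logged condition are taken from the trace.
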
16380 1727204172.27073: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16380 1727204172.27166: in run() - task 12b410aa-8751-749c-b6eb-000000000040 16380 1727204172.27180: variable 'ansible_search_path' from source: unknown 16380 1727204172.27184: variable 'ansible_search_path' from source: unknown 16380 1727204172.27226: calling self._execute() 16380 1727204172.27310: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.27314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.27331: variable 'omit' from source: magic vars 16380 1727204172.27664: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.27675: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204172.27832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204172.29985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204172.30046: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204172.30076: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204172.30109: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204172.30136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204172.30207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.30244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.30269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.30303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.30316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.30405: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.30418: Evaluated conditional (ansible_distribution_major_version | int < 8): False 16380 1727204172.30424: when evaluation is False, skipping this task 16380 1727204172.30428: _execute() done 16380 1727204172.30433: dumping result to json 16380 1727204172.30438: done dumping result, returning 16380 1727204172.30446: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-000000000040] 16380 
1727204172.30452: sending task result for task 12b410aa-8751-749c-b6eb-000000000040 16380 1727204172.30550: done sending task result for task 12b410aa-8751-749c-b6eb-000000000040 16380 1727204172.30553: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 16380 1727204172.30629: no more pending results, returning what we have 16380 1727204172.30632: results queue empty 16380 1727204172.30633: checking for any_errors_fatal 16380 1727204172.30642: done checking for any_errors_fatal 16380 1727204172.30643: checking for max_fail_percentage 16380 1727204172.30645: done checking for max_fail_percentage 16380 1727204172.30646: checking to see if all hosts have failed and the running result is not ok 16380 1727204172.30647: done checking to see if all hosts have failed 16380 1727204172.30648: getting the remaining hosts for this loop 16380 1727204172.30650: done getting the remaining hosts for this loop 16380 1727204172.30654: getting the next task for host managed-node2 16380 1727204172.30660: done getting next task for host managed-node2 16380 1727204172.30665: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16380 1727204172.30667: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204172.30684: getting variables 16380 1727204172.30686: in VariableManager get_vars() 16380 1727204172.30726: Calling all_inventory to load vars for managed-node2 16380 1727204172.30730: Calling groups_inventory to load vars for managed-node2 16380 1727204172.30733: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204172.30743: Calling all_plugins_play to load vars for managed-node2 16380 1727204172.30746: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204172.30750: Calling groups_plugins_play to load vars for managed-node2 16380 1727204172.33557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204172.37964: done with get_vars() 16380 1727204172.38019: done getting variables 16380 1727204172.38106: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.161) 0:00:33.488 ***** 16380 1727204172.38150: entering _queue_task() for managed-node2/fail 16380 1727204172.38627: worker is 1 (out of 1 available) 16380 1727204172.38641: exiting _queue_task() for managed-node2/fail 16380 1727204172.38654: done queuing things up, now waiting for results queue to drain 16380 1727204172.38656: waiting for pending results... 
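
The consent task queued above runs in the records below and repeats the pattern already seen twice in this section: a controller-side fail guarded by the wireless/team flags, skipped here because both render False. Only the action type (fail) and the logged condition are certain from the trace; the message wording and any additional guards in this sketch are invented:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      fail:
        msg: >-
          NetworkManager must be restarted to manage wireless or team
          interfaces, and user consent was not given.
      when: __network_wireless_connections_defined or __network_team_connections_defined
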
16380 1727204172.38802: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16380 1727204172.38887: in run() - task 12b410aa-8751-749c-b6eb-000000000041 16380 1727204172.38902: variable 'ansible_search_path' from source: unknown 16380 1727204172.38907: variable 'ansible_search_path' from source: unknown 16380 1727204172.38945: calling self._execute() 16380 1727204172.39031: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.39037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.39050: variable 'omit' from source: magic vars 16380 1727204172.39384: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.39396: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204172.39503: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204172.39682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204172.42143: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204172.42198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204172.42236: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204172.42265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204172.42295: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204172.42371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.42397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.42423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.42458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.42471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.42513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.42541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.42559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.42591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.42603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.42647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.42669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.42690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.42723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.42737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.42884: variable 'network_connections' from source: play vars 16380 1727204172.42899: variable 'profile' from source: play vars 16380 1727204172.42960: variable 'profile' from source: play vars 16380 1727204172.42963: variable 'interface' from source: set_fact 16380 1727204172.43018: variable 'interface' from source: set_fact 16380 1727204172.43099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204172.43255: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204172.43288: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204172.43319: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204172.43346: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204172.43384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204172.43408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204172.43432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.43462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204172.43513: 
variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204172.43728: variable 'network_connections' from source: play vars 16380 1727204172.43739: variable 'profile' from source: play vars 16380 1727204172.43791: variable 'profile' from source: play vars 16380 1727204172.43799: variable 'interface' from source: set_fact 16380 1727204172.43855: variable 'interface' from source: set_fact 16380 1727204172.43995: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16380 1727204172.43998: when evaluation is False, skipping this task 16380 1727204172.44000: _execute() done 16380 1727204172.44003: dumping result to json 16380 1727204172.44005: done dumping result, returning 16380 1727204172.44006: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-000000000041] 16380 1727204172.44022: sending task result for task 12b410aa-8751-749c-b6eb-000000000041 16380 1727204172.44108: done sending task result for task 12b410aa-8751-749c-b6eb-000000000041 16380 1727204172.44111: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16380 1727204172.44319: no more pending results, returning what we have 16380 1727204172.44323: results queue empty 16380 1727204172.44324: checking for any_errors_fatal 16380 1727204172.44333: done checking for any_errors_fatal 16380 1727204172.44334: checking for max_fail_percentage 16380 1727204172.44336: done checking for max_fail_percentage 16380 1727204172.44337: checking to see if all hosts have failed and the running result is not ok 16380 1727204172.44338: done checking to see if all hosts have failed 16380 1727204172.44339: getting the remaining hosts for this loop 16380 1727204172.44341: done getting the remaining hosts for this loop 16380 1727204172.44345: getting the next task for host managed-node2 16380 1727204172.44357: done getting next task for host managed-node2 16380 1727204172.44361: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 16380 1727204172.44363: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204172.44382: getting variables 16380 1727204172.44388: in VariableManager get_vars() 16380 1727204172.44441: Calling all_inventory to load vars for managed-node2 16380 1727204172.44445: Calling groups_inventory to load vars for managed-node2 16380 1727204172.44448: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204172.44460: Calling all_plugins_play to load vars for managed-node2 16380 1727204172.44464: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204172.44468: Calling groups_plugins_play to load vars for managed-node2 16380 1727204172.46563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204172.48932: done with get_vars() 16380 1727204172.48960: done getting variables 16380 1727204172.49046: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.109) 0:00:33.597 ***** 16380 1727204172.49075: entering _queue_task() for managed-node2/package 16380 1727204172.49418: worker is 1 (out of 1 available) 16380 1727204172.49433: exiting _queue_task() for managed-node2/package 16380 1727204172.49446: done queuing things up, now waiting for results queue to drain 16380 1727204172.49449: waiting for pending results... 16380 1727204172.49660: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 16380 1727204172.49751: in run() - task 12b410aa-8751-749c-b6eb-000000000042 16380 1727204172.49764: variable 'ansible_search_path' from source: unknown 16380 1727204172.49768: variable 'ansible_search_path' from source: unknown 16380 1727204172.49807: calling self._execute() 16380 1727204172.49901: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.49905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.49919: variable 'omit' from source: magic vars 16380 1727204172.50261: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.50273: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204172.50452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204172.50681: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204172.50726: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204172.50757: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204172.50818: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204172.50932: variable 'network_packages' from source: role '' defaults 16380 1727204172.51025: variable '__network_provider_setup' from source: role '' defaults 16380 1727204172.51038: variable '__network_service_name_default_nm' from source: role '' defaults 16380 1727204172.51092: variable 
'__network_service_name_default_nm' from source: role '' defaults 16380 1727204172.51100: variable '__network_packages_default_nm' from source: role '' defaults 16380 1727204172.51157: variable '__network_packages_default_nm' from source: role '' defaults 16380 1727204172.51316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204172.53457: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204172.53501: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204172.53550: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204172.53616: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204172.53642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204172.53755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.53782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.53816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.53849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.53865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.53919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.53954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.53992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.54080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.54087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.54324: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16380 1727204172.54513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.54556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.54586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.54650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.54661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.54745: variable 'ansible_python' from source: facts 16380 1727204172.54771: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16380 1727204172.54943: variable '__network_wpa_supplicant_required' from source: role '' defaults 16380 1727204172.55016: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16380 1727204172.55347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.55380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.55404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.55438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.55452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.55500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.55535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.55557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.55591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.55604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.55731: variable 'network_connections' from source: play vars 16380 1727204172.55736: variable 'profile' from source: play vars 16380 1727204172.55826: variable 'profile' from source: play vars 16380 1727204172.55833: variable 'interface' from source: set_fact 16380 1727204172.55898: variable 'interface' from source: set_fact 16380 1727204172.55960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204172.55984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204172.56012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.56039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204172.56080: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204172.56334: variable 'network_connections' from source: play vars 16380 1727204172.56338: variable 'profile' from source: play vars 16380 1727204172.56423: variable 'profile' from source: play vars 16380 1727204172.56431: variable 'interface' from source: set_fact 16380 1727204172.56488: variable 'interface' from source: set_fact 16380 1727204172.56522: variable '__network_packages_default_wireless' from source: role '' defaults 16380 1727204172.56590: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204172.56847: variable 'network_connections' from source: play vars 16380 1727204172.56881: variable 'profile' from source: play vars 16380 1727204172.56949: variable 'profile' from source: play vars 16380 1727204172.56952: variable 'interface' from source: set_fact 16380 1727204172.57051: variable 'interface' from source: set_fact 16380 1727204172.57074: variable '__network_packages_default_team' from source: role '' defaults 16380 1727204172.57174: variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204172.57487: variable 'network_connections' from source: play vars 16380 1727204172.57494: variable 'profile' from source: play vars 16380 1727204172.57548: variable 'profile' from source: play vars 16380 1727204172.57552: variable 'interface' from source: set_fact 16380 1727204172.57651: variable 'interface' from source: set_fact 16380 1727204172.57724: variable '__network_service_name_default_initscripts' from source: role '' defaults 16380 1727204172.57774: variable '__network_service_name_default_initscripts' from source: role '' defaults 16380 1727204172.57780: variable '__network_packages_default_initscripts' from source: role '' defaults 16380 1727204172.57836: variable '__network_packages_default_initscripts' from source: role '' defaults 16380 1727204172.58022: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16380 1727204172.58429: variable 'network_connections' from source: play vars 16380 1727204172.58435: variable 'profile' from source: play vars 16380 
1727204172.58488: variable 'profile' from source: play vars 16380 1727204172.58493: variable 'interface' from source: set_fact 16380 1727204172.58546: variable 'interface' from source: set_fact 16380 1727204172.58554: variable 'ansible_distribution' from source: facts 16380 1727204172.58560: variable '__network_rh_distros' from source: role '' defaults 16380 1727204172.58566: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.58583: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16380 1727204172.58724: variable 'ansible_distribution' from source: facts 16380 1727204172.58728: variable '__network_rh_distros' from source: role '' defaults 16380 1727204172.58734: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.58741: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16380 1727204172.58909: variable 'ansible_distribution' from source: facts 16380 1727204172.58912: variable '__network_rh_distros' from source: role '' defaults 16380 1727204172.58915: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.58947: variable 'network_provider' from source: set_fact 16380 1727204172.58967: variable 'ansible_facts' from source: unknown 16380 1727204172.59808: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 16380 1727204172.59812: when evaluation is False, skipping this task 16380 1727204172.59815: _execute() done 16380 1727204172.59818: dumping result to json 16380 1727204172.59822: done dumping result, returning 16380 1727204172.59831: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-749c-b6eb-000000000042] 16380 1727204172.59837: sending task result for task 12b410aa-8751-749c-b6eb-000000000042 16380 1727204172.59939: done sending task result for task 12b410aa-8751-749c-b6eb-000000000042 16380 1727204172.59942: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 16380 1727204172.60003: no more pending results, returning what we have 16380 1727204172.60007: results queue empty 16380 1727204172.60008: checking for any_errors_fatal 16380 1727204172.60017: done checking for any_errors_fatal 16380 1727204172.60018: checking for max_fail_percentage 16380 1727204172.60020: done checking for max_fail_percentage 16380 1727204172.60021: checking to see if all hosts have failed and the running result is not ok 16380 1727204172.60022: done checking to see if all hosts have failed 16380 1727204172.60023: getting the remaining hosts for this loop 16380 1727204172.60024: done getting the remaining hosts for this loop 16380 1727204172.60029: getting the next task for host managed-node2 16380 1727204172.60037: done getting next task for host managed-node2 16380 1727204172.60041: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16380 1727204172.60043: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204172.60059: getting variables 16380 1727204172.60063: in VariableManager get_vars() 16380 1727204172.60113: Calling all_inventory to load vars for managed-node2 16380 1727204172.60117: Calling groups_inventory to load vars for managed-node2 16380 1727204172.60120: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204172.60136: Calling all_plugins_play to load vars for managed-node2 16380 1727204172.60139: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204172.60142: Calling groups_plugins_play to load vars for managed-node2 16380 1727204172.62216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204172.63816: done with get_vars() 16380 1727204172.63849: done getting variables 16380 1727204172.63910: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.148) 0:00:33.745 ***** 16380 1727204172.63939: entering _queue_task() for managed-node2/package 16380 1727204172.64229: worker is 1 (out of 1 available) 16380 1727204172.64247: exiting _queue_task() for managed-node2/package 16380 1727204172.64261: done queuing things up, now waiting for results queue to drain 16380 1727204172.64264: waiting for pending results... 
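The "Install packages" task above was skipped because its guard, not network_packages is subset(ansible_facts.packages.keys()), evaluated False: every package the role wants is already recorded in the gathered package facts, so no package-manager transaction is needed. A minimal sketch of a tasks-file entry with that shape, assembled from the task name, the loaded 'package' action module, and the logged false_condition (the shipped role source at roles/network/tasks/main.yml:73 may differ in module options):

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

The subset test is purely an optimization: the package module is idempotent anyway, but the guard avoids an unnecessary round-trip to the package manager when nothing is missing.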
16380 1727204172.64477: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16380 1727204172.64565: in run() - task 12b410aa-8751-749c-b6eb-000000000043 16380 1727204172.64579: variable 'ansible_search_path' from source: unknown 16380 1727204172.64583: variable 'ansible_search_path' from source: unknown 16380 1727204172.64624: calling self._execute() 16380 1727204172.64715: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.64725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.64735: variable 'omit' from source: magic vars 16380 1727204172.65076: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.65088: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204172.65199: variable 'network_state' from source: role '' defaults 16380 1727204172.65210: Evaluated conditional (network_state != {}): False 16380 1727204172.65214: when evaluation is False, skipping this task 16380 1727204172.65219: _execute() done 16380 1727204172.65222: dumping result to json 16380 1727204172.65226: done dumping result, returning 16380 1727204172.65234: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-749c-b6eb-000000000043] 16380 1727204172.65241: sending task result for task 12b410aa-8751-749c-b6eb-000000000043 16380 1727204172.65345: done sending task result for task 12b410aa-8751-749c-b6eb-000000000043 16380 1727204172.65348: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204172.65414: no more pending results, returning what we have 16380 1727204172.65420: results queue empty 16380 1727204172.65421: checking for any_errors_fatal 16380 1727204172.65433: done checking for any_errors_fatal 16380 1727204172.65434: checking for max_fail_percentage 16380 1727204172.65436: done checking for max_fail_percentage 16380 1727204172.65437: checking to see if all hosts have failed and the running result is not ok 16380 1727204172.65438: done checking to see if all hosts have failed 16380 1727204172.65439: getting the remaining hosts for this loop 16380 1727204172.65441: done getting the remaining hosts for this loop 16380 1727204172.65446: getting the next task for host managed-node2 16380 1727204172.65452: done getting next task for host managed-node2 16380 1727204172.65458: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16380 1727204172.65460: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204172.65478: getting variables 16380 1727204172.65480: in VariableManager get_vars() 16380 1727204172.65524: Calling all_inventory to load vars for managed-node2 16380 1727204172.65528: Calling groups_inventory to load vars for managed-node2 16380 1727204172.65530: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204172.65540: Calling all_plugins_play to load vars for managed-node2 16380 1727204172.65543: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204172.65547: Calling groups_plugins_play to load vars for managed-node2 16380 1727204172.66959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204172.68552: done with get_vars() 16380 1727204172.68580: done getting variables 16380 1727204172.68634: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.047) 0:00:33.793 ***** 16380 1727204172.68663: entering _queue_task() for managed-node2/package 16380 1727204172.68938: worker is 1 (out of 1 available) 16380 1727204172.68954: exiting _queue_task() for managed-node2/package 16380 1727204172.68968: done queuing things up, now waiting for results queue to drain 16380 1727204172.68971: waiting for pending results... 
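The two install tasks at main.yml:85 and main.yml:96 carry the same guard, network_state != {}. The first was just skipped with exactly that false_condition, and the second, queued above, is about to skip the same way, because network_state was left at its empty role default. A minimal sketch of the second task, with the package name inferred from its title rather than taken from the role source:

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate   # inferred from the task name
    state: present
  when: network_state != {}

In other words, the nmstate code path is only provisioned when the caller actually passes declarative network_state data instead of network_connections profiles.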
16380 1727204172.69165: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16380 1727204172.69251: in run() - task 12b410aa-8751-749c-b6eb-000000000044 16380 1727204172.69265: variable 'ansible_search_path' from source: unknown 16380 1727204172.69269: variable 'ansible_search_path' from source: unknown 16380 1727204172.69304: calling self._execute() 16380 1727204172.69411: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.69417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.69420: variable 'omit' from source: magic vars 16380 1727204172.69890: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.69894: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204172.69963: variable 'network_state' from source: role '' defaults 16380 1727204172.69967: Evaluated conditional (network_state != {}): False 16380 1727204172.69970: when evaluation is False, skipping this task 16380 1727204172.69973: _execute() done 16380 1727204172.69975: dumping result to json 16380 1727204172.69978: done dumping result, returning 16380 1727204172.69991: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-749c-b6eb-000000000044] 16380 1727204172.69994: sending task result for task 12b410aa-8751-749c-b6eb-000000000044 16380 1727204172.70130: done sending task result for task 12b410aa-8751-749c-b6eb-000000000044 16380 1727204172.70133: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204172.70194: no more pending results, returning what we have 16380 1727204172.70198: results queue empty 16380 1727204172.70199: checking for any_errors_fatal 16380 1727204172.70207: done checking for any_errors_fatal 16380 1727204172.70207: checking for max_fail_percentage 16380 1727204172.70209: done checking for max_fail_percentage 16380 1727204172.70211: checking to see if all hosts have failed and the running result is not ok 16380 1727204172.70211: done checking to see if all hosts have failed 16380 1727204172.70213: getting the remaining hosts for this loop 16380 1727204172.70215: done getting the remaining hosts for this loop 16380 1727204172.70221: getting the next task for host managed-node2 16380 1727204172.70231: done getting next task for host managed-node2 16380 1727204172.70236: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16380 1727204172.70238: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204172.70254: getting variables 16380 1727204172.70256: in VariableManager get_vars() 16380 1727204172.70298: Calling all_inventory to load vars for managed-node2 16380 1727204172.70302: Calling groups_inventory to load vars for managed-node2 16380 1727204172.70304: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204172.70319: Calling all_plugins_play to load vars for managed-node2 16380 1727204172.70322: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204172.70326: Calling groups_plugins_play to load vars for managed-node2 16380 1727204172.72168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204172.73790: done with get_vars() 16380 1727204172.73820: done getting variables 16380 1727204172.73876: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.052) 0:00:33.845 ***** 16380 1727204172.73906: entering _queue_task() for managed-node2/service 16380 1727204172.74181: worker is 1 (out of 1 available) 16380 1727204172.74197: exiting _queue_task() for managed-node2/service 16380 1727204172.74212: done queuing things up, now waiting for results queue to drain 16380 1727204172.74214: waiting for pending results... 
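The service task queued above (main.yml:109) uses the same guard as the consent prompt skipped earlier in this section: __network_wireless_connections_defined or __network_team_connections_defined. NetworkManager is only restarted when the requested profiles include wireless or team interfaces, whose support plugins need a daemon restart to be picked up after installation. A minimal sketch of that shape; the unit name here is assumed from the task title, and the shipped role may template it instead:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumed; the role may resolve this from a variable
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined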
16380 1727204172.74405: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16380 1727204172.74494: in run() - task 12b410aa-8751-749c-b6eb-000000000045 16380 1727204172.74509: variable 'ansible_search_path' from source: unknown 16380 1727204172.74513: variable 'ansible_search_path' from source: unknown 16380 1727204172.74555: calling self._execute() 16380 1727204172.74638: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.74644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.74656: variable 'omit' from source: magic vars 16380 1727204172.74989: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.75003: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204172.75112: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204172.75283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204172.77498: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204172.77586: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204172.77656: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204172.77707: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204172.77737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204172.77809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.77849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.77884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.78094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.78097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.78101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.78106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.78145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 16380 1727204172.78180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.78198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.78293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.78299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.78347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.78407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.78423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.78574: variable 'network_connections' from source: play vars 16380 1727204172.78586: variable 'profile' from source: play vars 16380 1727204172.78651: variable 'profile' from source: play vars 16380 1727204172.78655: variable 'interface' from source: set_fact 16380 1727204172.78708: variable 'interface' from source: set_fact 16380 1727204172.78776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204172.79094: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204172.79098: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204172.79100: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204172.79103: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204172.79135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204172.79168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204172.79208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.79250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204172.79311: variable '__network_team_connections_defined' from source: role '' defaults 16380 
1727204172.79645: variable 'network_connections' from source: play vars 16380 1727204172.79903: variable 'profile' from source: play vars 16380 1727204172.79985: variable 'profile' from source: play vars 16380 1727204172.79997: variable 'interface' from source: set_fact 16380 1727204172.80075: variable 'interface' from source: set_fact 16380 1727204172.80241: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16380 1727204172.80245: when evaluation is False, skipping this task 16380 1727204172.80248: _execute() done 16380 1727204172.80250: dumping result to json 16380 1727204172.80256: done dumping result, returning 16380 1727204172.80266: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-000000000045] 16380 1727204172.80277: sending task result for task 12b410aa-8751-749c-b6eb-000000000045 16380 1727204172.80577: done sending task result for task 12b410aa-8751-749c-b6eb-000000000045 16380 1727204172.80581: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16380 1727204172.80632: no more pending results, returning what we have 16380 1727204172.80635: results queue empty 16380 1727204172.80636: checking for any_errors_fatal 16380 1727204172.80644: done checking for any_errors_fatal 16380 1727204172.80645: checking for max_fail_percentage 16380 1727204172.80647: done checking for max_fail_percentage 16380 1727204172.80648: checking to see if all hosts have failed and the running result is not ok 16380 1727204172.80649: done checking to see if all hosts have failed 16380 1727204172.80650: getting the remaining hosts for this loop 16380 1727204172.80651: done getting the remaining hosts for this loop 16380 1727204172.80655: getting the next task for host managed-node2 16380 1727204172.80662: done getting next task for host managed-node2 16380 1727204172.80665: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16380 1727204172.80667: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204172.80683: getting variables 16380 1727204172.80684: in VariableManager get_vars() 16380 1727204172.80741: Calling all_inventory to load vars for managed-node2 16380 1727204172.80744: Calling groups_inventory to load vars for managed-node2 16380 1727204172.80751: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204172.80761: Calling all_plugins_play to load vars for managed-node2 16380 1727204172.80766: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204172.80770: Calling groups_plugins_play to load vars for managed-node2 16380 1727204172.82115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204172.84593: done with get_vars() 16380 1727204172.84632: done getting variables 16380 1727204172.84705: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:12 -0400 (0:00:00.108) 0:00:33.954 ***** 16380 1727204172.84742: entering _queue_task() for managed-node2/service 16380 1727204172.85132: worker is 1 (out of 1 available) 16380 1727204172.85145: exiting _queue_task() for managed-node2/service 16380 1727204172.85158: done queuing things up, now waiting for results queue to drain 16380 1727204172.85161: waiting for pending results... 16380 1727204172.85425: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16380 1727204172.85550: in run() - task 12b410aa-8751-749c-b6eb-000000000046 16380 1727204172.85575: variable 'ansible_search_path' from source: unknown 16380 1727204172.85585: variable 'ansible_search_path' from source: unknown 16380 1727204172.85634: calling self._execute() 16380 1727204172.85745: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.85759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.85832: variable 'omit' from source: magic vars 16380 1727204172.86238: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.86257: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204172.86491: variable 'network_provider' from source: set_fact 16380 1727204172.86504: variable 'network_state' from source: role '' defaults 16380 1727204172.86519: Evaluated conditional (network_provider == "nm" or network_state != {}): True 16380 1727204172.86531: variable 'omit' from source: magic vars 16380 1727204172.86576: variable 'omit' from source: magic vars 16380 1727204172.86623: variable 'network_service_name' from source: role '' defaults 16380 1727204172.86712: variable 'network_service_name' from source: role '' defaults 16380 1727204172.86860: variable '__network_provider_setup' from source: role '' defaults 16380 1727204172.86915: variable '__network_service_name_default_nm' from source: role '' defaults 16380 1727204172.86958: variable '__network_service_name_default_nm' from source: role '' defaults 16380 1727204172.86972: variable '__network_packages_default_nm' from source: role '' defaults 
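This is the first task in the section whose guard passes: network_provider == "nm" or network_state != {} evaluates True, and since network_state was shown to be empty twice above, the provider set earlier via set_fact must be "nm". The role therefore proceeds to manage the unit named by network_service_name. A minimal sketch of the shape of main.yml:122, with the started/enabled states assumed rather than read from the role source:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}

Because the guard is True, the executor goes on to resolve the connection variables, select the ssh connection plugin and sh shell, and run the service action module against managed-node2 — the _low_level_execute_command() traffic that follows, which first probes the remote home directory and then creates the per-task ansible-tmp working directory.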
16380 1727204172.87058: variable '__network_packages_default_nm' from source: role '' defaults 16380 1727204172.87378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204172.89937: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204172.90033: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204172.90096: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204172.90167: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204172.90194: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204172.90386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.90391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.90394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.90438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.90463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.90535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.90569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.90612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.90667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.90693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.91021: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16380 1727204172.91192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.91228: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.91271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.91368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.91371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.91444: variable 'ansible_python' from source: facts 16380 1727204172.91478: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16380 1727204172.91591: variable '__network_wpa_supplicant_required' from source: role '' defaults 16380 1727204172.91686: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16380 1727204172.91857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.91897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.91995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.91999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.92010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.92078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204172.92123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204172.92164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.92242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204172.92246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204172.92432: variable 'network_connections' from 
source: play vars 16380 1727204172.92445: variable 'profile' from source: play vars 16380 1727204172.92568: variable 'profile' from source: play vars 16380 1727204172.92572: variable 'interface' from source: set_fact 16380 1727204172.92633: variable 'interface' from source: set_fact 16380 1727204172.92769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204172.93014: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204172.93098: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204172.93143: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204172.93214: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204172.93301: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204172.93395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204172.93399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204172.93438: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204172.93504: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204172.93907: variable 'network_connections' from source: play vars 16380 1727204172.93921: variable 'profile' from source: play vars 16380 1727204172.94018: variable 'profile' from source: play vars 16380 1727204172.94095: variable 'interface' from source: set_fact 16380 1727204172.94112: variable 'interface' from source: set_fact 16380 1727204172.94155: variable '__network_packages_default_wireless' from source: role '' defaults 16380 1727204172.94266: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204172.94668: variable 'network_connections' from source: play vars 16380 1727204172.94679: variable 'profile' from source: play vars 16380 1727204172.94773: variable 'profile' from source: play vars 16380 1727204172.94784: variable 'interface' from source: set_fact 16380 1727204172.94884: variable 'interface' from source: set_fact 16380 1727204172.94922: variable '__network_packages_default_team' from source: role '' defaults 16380 1727204172.95196: variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204172.95430: variable 'network_connections' from source: play vars 16380 1727204172.95441: variable 'profile' from source: play vars 16380 1727204172.95535: variable 'profile' from source: play vars 16380 1727204172.95546: variable 'interface' from source: set_fact 16380 1727204172.95643: variable 'interface' from source: set_fact 16380 1727204172.95718: variable '__network_service_name_default_initscripts' from source: role '' defaults 16380 1727204172.95802: variable '__network_service_name_default_initscripts' from source: role '' defaults 16380 1727204172.95815: 
variable '__network_packages_default_initscripts' from source: role '' defaults 16380 1727204172.95897: variable '__network_packages_default_initscripts' from source: role '' defaults 16380 1727204172.96209: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16380 1727204172.96888: variable 'network_connections' from source: play vars 16380 1727204172.96904: variable 'profile' from source: play vars 16380 1727204172.96983: variable 'profile' from source: play vars 16380 1727204172.96999: variable 'interface' from source: set_fact 16380 1727204172.97088: variable 'interface' from source: set_fact 16380 1727204172.97107: variable 'ansible_distribution' from source: facts 16380 1727204172.97116: variable '__network_rh_distros' from source: role '' defaults 16380 1727204172.97127: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.97149: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16380 1727204172.97381: variable 'ansible_distribution' from source: facts 16380 1727204172.97395: variable '__network_rh_distros' from source: role '' defaults 16380 1727204172.97485: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.97488: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16380 1727204172.97617: variable 'ansible_distribution' from source: facts 16380 1727204172.97626: variable '__network_rh_distros' from source: role '' defaults 16380 1727204172.97636: variable 'ansible_distribution_major_version' from source: facts 16380 1727204172.97755: variable 'network_provider' from source: set_fact 16380 1727204172.97795: variable 'omit' from source: magic vars 16380 1727204172.97841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204172.97885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204172.97917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204172.97953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204172.97973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204172.98019: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204172.98036: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.98094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.98180: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204172.98198: Set connection var ansible_shell_executable to /bin/sh 16380 1727204172.98211: Set connection var ansible_connection to ssh 16380 1727204172.98226: Set connection var ansible_shell_type to sh 16380 1727204172.98240: Set connection var ansible_pipelining to False 16380 1727204172.98261: Set connection var ansible_timeout to 10 16380 1727204172.98298: variable 'ansible_shell_executable' from source: unknown 16380 1727204172.98307: variable 'ansible_connection' from source: unknown 16380 1727204172.98316: variable 'ansible_module_compression' from source: unknown 16380 1727204172.98362: variable 'ansible_shell_type' from source: unknown 16380 1727204172.98366: variable 'ansible_shell_executable' from 
source: unknown 16380 1727204172.98369: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204172.98376: variable 'ansible_pipelining' from source: unknown 16380 1727204172.98378: variable 'ansible_timeout' from source: unknown 16380 1727204172.98380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204172.98514: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204172.98535: variable 'omit' from source: magic vars 16380 1727204172.98580: starting attempt loop 16380 1727204172.98584: running the handler 16380 1727204172.98660: variable 'ansible_facts' from source: unknown 16380 1727204172.99894: _low_level_execute_command(): starting 16380 1727204172.99908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204173.00718: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204173.00770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204173.00825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204173.00853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204173.00895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204173.00967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204173.02752: stdout chunk (state=3): >>>/root <<< 16380 1727204173.02970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204173.02974: stdout chunk (state=3): >>><<< 16380 1727204173.02976: stderr chunk (state=3): >>><<< 16380 1727204173.03110: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204173.03114: _low_level_execute_command(): starting 16380 1727204173.03120: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028 `" && echo ansible-tmp-1727204173.0300004-18634-106038011336028="` echo /root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028 `" ) && sleep 0' 16380 1727204173.03732: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204173.03772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204173.03887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204173.03911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204173.03999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204173.06113: stdout chunk (state=3): >>>ansible-tmp-1727204173.0300004-18634-106038011336028=/root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028 <<< 16380 1727204173.06309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204173.06313: stdout chunk (state=3): >>><<< 16380 1727204173.06503: stderr chunk (state=3): >>><<< 16380 1727204173.06507: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204173.0300004-18634-106038011336028=/root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204173.06510: variable 'ansible_module_compression' from source: unknown 16380 1727204173.06695: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 16380 1727204173.06698: variable 'ansible_facts' from source: unknown 16380 1727204173.06814: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/AnsiballZ_systemd.py 16380 1727204173.07014: Sending initial data 16380 1727204173.07021: Sent initial data (156 bytes) 16380 1727204173.07609: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204173.07622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204173.07632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204173.07705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204173.07746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204173.07760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204173.07770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204173.07841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204173.09512: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 16380 1727204173.09539: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 16380 1727204173.09567: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204173.09605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204173.09675: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpuu67c5ru /root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/AnsiballZ_systemd.py <<< 16380 1727204173.09704: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/AnsiballZ_systemd.py" <<< 16380 1727204173.09709: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpuu67c5ru" to remote "/root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/AnsiballZ_systemd.py" <<< 16380 1727204173.12305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204173.12310: stdout chunk (state=3): >>><<< 16380 1727204173.12312: stderr chunk (state=3): >>><<< 16380 1727204173.12315: done transferring module to remote 16380 1727204173.12317: _low_level_execute_command(): starting 16380 1727204173.12320: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/ /root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/AnsiballZ_systemd.py && sleep 0' 16380 1727204173.13020: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204173.13090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204173.13157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204173.13211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204173.13298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204173.15302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204173.15325: stdout chunk (state=3): >>><<< 16380 1727204173.15329: stderr chunk (state=3): >>><<< 16380 1727204173.15433: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204173.15437: _low_level_execute_command(): starting 16380 1727204173.15439: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/AnsiballZ_systemd.py && sleep 0' 16380 1727204173.15979: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204173.16000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204173.16016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204173.16035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204173.16054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204173.16063: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204173.16074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204173.16104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204173.16183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204173.16236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204173.16292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204173.49545: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", 
"ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4456448", "MemoryAvailable": "infinity", "CPUUsageNSec": "1176350000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 16380 1727204173.49586: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": 
"8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target 
system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 16380 1727204173.51921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204173.51926: stdout chunk (state=3): >>><<< 16380 1727204173.51929: stderr chunk (state=3): >>><<< 16380 1727204173.52096: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4456448", "MemoryAvailable": "infinity", "CPUUsageNSec": "1176350000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204173.52734: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204173.52780: _low_level_execute_command(): starting 16380 1727204173.52910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204173.0300004-18634-106038011336028/ > /dev/null 2>&1 && sleep 0' 16380 1727204173.53995: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204173.54194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204173.54198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204173.54522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204173.56503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204173.56557: stderr chunk (state=3): >>><<< 16380 1727204173.56564: 
stdout chunk (state=3): >>><<< 16380 1727204173.56584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204173.56596: handler run complete 16380 1727204173.56646: attempt loop complete, returning result 16380 1727204173.56652: _execute() done 16380 1727204173.56657: dumping result to json 16380 1727204173.56674: done dumping result, returning 16380 1727204173.56687: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-749c-b6eb-000000000046] 16380 1727204173.56697: sending task result for task 12b410aa-8751-749c-b6eb-000000000046 16380 1727204173.56910: done sending task result for task 12b410aa-8751-749c-b6eb-000000000046 16380 1727204173.56914: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204173.56969: no more pending results, returning what we have 16380 1727204173.56972: results queue empty 16380 1727204173.56973: checking for any_errors_fatal 16380 1727204173.56980: done checking for any_errors_fatal 16380 1727204173.56980: checking for max_fail_percentage 16380 1727204173.56982: done checking for max_fail_percentage 16380 1727204173.56983: checking to see if all hosts have failed and the running result is not ok 16380 1727204173.56984: done checking to see if all hosts have failed 16380 1727204173.56985: getting the remaining hosts for this loop 16380 1727204173.56988: done getting the remaining hosts for this loop 16380 1727204173.56994: getting the next task for host managed-node2 16380 1727204173.57002: done getting next task for host managed-node2 16380 1727204173.57005: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16380 1727204173.57008: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204173.57019: getting variables 16380 1727204173.57021: in VariableManager get_vars() 16380 1727204173.57064: Calling all_inventory to load vars for managed-node2 16380 1727204173.57067: Calling groups_inventory to load vars for managed-node2 16380 1727204173.57070: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204173.57081: Calling all_plugins_play to load vars for managed-node2 16380 1727204173.57084: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204173.57088: Calling groups_plugins_play to load vars for managed-node2 16380 1727204173.59669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204173.63040: done with get_vars() 16380 1727204173.63081: done getting variables 16380 1727204173.63164: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:13 -0400 (0:00:00.784) 0:00:34.738 ***** 16380 1727204173.63204: entering _queue_task() for managed-node2/service 16380 1727204173.63794: worker is 1 (out of 1 available) 16380 1727204173.63810: exiting _queue_task() for managed-node2/service 16380 1727204173.63826: done queuing things up, now waiting for results queue to drain 16380 1727204173.63828: waiting for pending results... 16380 1727204173.64407: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16380 1727204173.64486: in run() - task 12b410aa-8751-749c-b6eb-000000000047 16380 1727204173.64520: variable 'ansible_search_path' from source: unknown 16380 1727204173.64530: variable 'ansible_search_path' from source: unknown 16380 1727204173.64583: calling self._execute() 16380 1727204173.64765: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204173.64769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204173.64771: variable 'omit' from source: magic vars 16380 1727204173.65423: variable 'ansible_distribution_major_version' from source: facts 16380 1727204173.65446: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204173.65612: variable 'network_provider' from source: set_fact 16380 1727204173.65635: Evaluated conditional (network_provider == "nm"): True 16380 1727204173.65795: variable '__network_wpa_supplicant_required' from source: role '' defaults 16380 1727204173.65901: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16380 1727204173.66163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204173.69123: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204173.69295: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204173.69299: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204173.69324: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204173.69359: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204173.69480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204173.69550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204173.69596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204173.69664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204173.69685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204173.69760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204173.69796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204173.69894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204173.69926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204173.69957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204173.70013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204173.70055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204173.70091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204173.70168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204173.70184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 16380 1727204173.70397: variable 'network_connections' from source: play vars 16380 1727204173.70420: variable 'profile' from source: play vars 16380 1727204173.70516: variable 'profile' from source: play vars 16380 1727204173.70694: variable 'interface' from source: set_fact 16380 1727204173.70698: variable 'interface' from source: set_fact 16380 1727204173.70712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204173.70943: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204173.70996: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204173.71049: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204173.71086: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204173.71157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204173.71188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204173.71228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204173.71272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204173.71334: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204173.71715: variable 'network_connections' from source: play vars 16380 1727204173.71730: variable 'profile' from source: play vars 16380 1727204173.71812: variable 'profile' from source: play vars 16380 1727204173.71894: variable 'interface' from source: set_fact 16380 1727204173.71898: variable 'interface' from source: set_fact 16380 1727204173.71949: Evaluated conditional (__network_wpa_supplicant_required): False 16380 1727204173.71957: when evaluation is False, skipping this task 16380 1727204173.71964: _execute() done 16380 1727204173.71977: dumping result to json 16380 1727204173.71984: done dumping result, returning 16380 1727204173.71998: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-749c-b6eb-000000000047] 16380 1727204173.72007: sending task result for task 12b410aa-8751-749c-b6eb-000000000047 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 16380 1727204173.72172: no more pending results, returning what we have 16380 1727204173.72176: results queue empty 16380 1727204173.72177: checking for any_errors_fatal 16380 1727204173.72208: done checking for any_errors_fatal 16380 1727204173.72210: checking for max_fail_percentage 16380 1727204173.72213: done checking for max_fail_percentage 16380 1727204173.72214: checking to see if all hosts have failed and the running result is not ok 16380 1727204173.72215: done checking to see if all hosts 
have failed 16380 1727204173.72219: getting the remaining hosts for this loop 16380 1727204173.72221: done getting the remaining hosts for this loop 16380 1727204173.72293: getting the next task for host managed-node2 16380 1727204173.72304: done getting next task for host managed-node2 16380 1727204173.72309: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 16380 1727204173.72312: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204173.72332: getting variables 16380 1727204173.72335: in VariableManager get_vars() 16380 1727204173.72638: Calling all_inventory to load vars for managed-node2 16380 1727204173.72642: Calling groups_inventory to load vars for managed-node2 16380 1727204173.72645: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204173.72660: Calling all_plugins_play to load vars for managed-node2 16380 1727204173.72664: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204173.72668: Calling groups_plugins_play to load vars for managed-node2 16380 1727204173.73273: done sending task result for task 12b410aa-8751-749c-b6eb-000000000047 16380 1727204173.73276: WORKER PROCESS EXITING 16380 1727204173.75064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204173.78194: done with get_vars() 16380 1727204173.78248: done getting variables 16380 1727204173.78334: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:13 -0400 (0:00:00.151) 0:00:34.890 ***** 16380 1727204173.78372: entering _queue_task() for managed-node2/service 16380 1727204173.78884: worker is 1 (out of 1 available) 16380 1727204173.78898: exiting _queue_task() for managed-node2/service 16380 1727204173.78911: done queuing things up, now waiting for results queue to drain 16380 1727204173.78913: waiting for pending results... 
16380 1727204173.79136: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 16380 1727204173.79274: in run() - task 12b410aa-8751-749c-b6eb-000000000048 16380 1727204173.79302: variable 'ansible_search_path' from source: unknown 16380 1727204173.79312: variable 'ansible_search_path' from source: unknown 16380 1727204173.79363: calling self._execute() 16380 1727204173.79481: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204173.79498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204173.79523: variable 'omit' from source: magic vars 16380 1727204173.79995: variable 'ansible_distribution_major_version' from source: facts 16380 1727204173.80021: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204173.80193: variable 'network_provider' from source: set_fact 16380 1727204173.80208: Evaluated conditional (network_provider == "initscripts"): False 16380 1727204173.80220: when evaluation is False, skipping this task 16380 1727204173.80234: _execute() done 16380 1727204173.80244: dumping result to json 16380 1727204173.80253: done dumping result, returning 16380 1727204173.80267: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-749c-b6eb-000000000048] 16380 1727204173.80283: sending task result for task 12b410aa-8751-749c-b6eb-000000000048 16380 1727204173.80519: done sending task result for task 12b410aa-8751-749c-b6eb-000000000048 16380 1727204173.80523: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204173.80576: no more pending results, returning what we have 16380 1727204173.80581: results queue empty 16380 1727204173.80582: checking for any_errors_fatal 16380 1727204173.80597: done checking for any_errors_fatal 16380 1727204173.80599: checking for max_fail_percentage 16380 1727204173.80601: done checking for max_fail_percentage 16380 1727204173.80602: checking to see if all hosts have failed and the running result is not ok 16380 1727204173.80603: done checking to see if all hosts have failed 16380 1727204173.80605: getting the remaining hosts for this loop 16380 1727204173.80607: done getting the remaining hosts for this loop 16380 1727204173.80612: getting the next task for host managed-node2 16380 1727204173.80624: done getting next task for host managed-node2 16380 1727204173.80629: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16380 1727204173.80633: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204173.80653: getting variables 16380 1727204173.80655: in VariableManager get_vars() 16380 1727204173.80824: Calling all_inventory to load vars for managed-node2 16380 1727204173.80828: Calling groups_inventory to load vars for managed-node2 16380 1727204173.80831: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204173.80844: Calling all_plugins_play to load vars for managed-node2 16380 1727204173.80847: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204173.80851: Calling groups_plugins_play to load vars for managed-node2 16380 1727204173.88930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204173.92742: done with get_vars() 16380 1727204173.92788: done getting variables 16380 1727204173.92861: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:13 -0400 (0:00:00.145) 0:00:35.035 ***** 16380 1727204173.92899: entering _queue_task() for managed-node2/copy 16380 1727204173.93272: worker is 1 (out of 1 available) 16380 1727204173.93286: exiting _queue_task() for managed-node2/copy 16380 1727204173.93403: done queuing things up, now waiting for results queue to drain 16380 1727204173.93406: waiting for pending results... 16380 1727204173.93649: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16380 1727204173.93896: in run() - task 12b410aa-8751-749c-b6eb-000000000049 16380 1727204173.93901: variable 'ansible_search_path' from source: unknown 16380 1727204173.93905: variable 'ansible_search_path' from source: unknown 16380 1727204173.93908: calling self._execute() 16380 1727204173.93986: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204173.94006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204173.94030: variable 'omit' from source: magic vars 16380 1727204173.94534: variable 'ansible_distribution_major_version' from source: facts 16380 1727204173.94553: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204173.94733: variable 'network_provider' from source: set_fact 16380 1727204173.94750: Evaluated conditional (network_provider == "initscripts"): False 16380 1727204173.94762: when evaluation is False, skipping this task 16380 1727204173.94771: _execute() done 16380 1727204173.94892: dumping result to json 16380 1727204173.94898: done dumping result, returning 16380 1727204173.94905: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-749c-b6eb-000000000049] 16380 1727204173.94908: sending task result for task 12b410aa-8751-749c-b6eb-000000000049 16380 1727204173.94988: done sending task result for task 12b410aa-8751-749c-b6eb-000000000049 16380 1727204173.94994: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 16380 1727204173.95053: no more pending results, returning what we have 16380 1727204173.95058: results queue empty 16380 1727204173.95059: checking for any_errors_fatal 16380 1727204173.95067: done checking for any_errors_fatal 16380 1727204173.95069: checking for max_fail_percentage 16380 1727204173.95071: done checking for max_fail_percentage 16380 1727204173.95072: checking to see if all hosts have failed and the running result is not ok 16380 1727204173.95073: done checking to see if all hosts have failed 16380 1727204173.95074: getting the remaining hosts for this loop 16380 1727204173.95076: done getting the remaining hosts for this loop 16380 1727204173.95081: getting the next task for host managed-node2 16380 1727204173.95294: done getting next task for host managed-node2 16380 1727204173.95300: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16380 1727204173.95303: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204173.95321: getting variables 16380 1727204173.95324: in VariableManager get_vars() 16380 1727204173.95371: Calling all_inventory to load vars for managed-node2 16380 1727204173.95375: Calling groups_inventory to load vars for managed-node2 16380 1727204173.95378: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204173.95395: Calling all_plugins_play to load vars for managed-node2 16380 1727204173.95399: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204173.95404: Calling groups_plugins_play to load vars for managed-node2 16380 1727204173.97853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204174.00928: done with get_vars() 16380 1727204174.00970: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.081) 0:00:35.117 ***** 16380 1727204174.01084: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16380 1727204174.01483: worker is 1 (out of 1 available) 16380 1727204174.01603: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16380 1727204174.01619: done queuing things up, now waiting for results queue to drain 16380 1727204174.01622: waiting for pending results... 
16380 1727204174.02206: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16380 1727204174.02212: in run() - task 12b410aa-8751-749c-b6eb-00000000004a 16380 1727204174.02216: variable 'ansible_search_path' from source: unknown 16380 1727204174.02219: variable 'ansible_search_path' from source: unknown 16380 1727204174.02222: calling self._execute() 16380 1727204174.02226: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204174.02230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204174.02234: variable 'omit' from source: magic vars 16380 1727204174.02684: variable 'ansible_distribution_major_version' from source: facts 16380 1727204174.02698: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204174.02706: variable 'omit' from source: magic vars 16380 1727204174.02753: variable 'omit' from source: magic vars 16380 1727204174.03195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204174.05784: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204174.05875: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204174.05927: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204174.05992: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204174.06026: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204174.06129: variable 'network_provider' from source: set_fact 16380 1727204174.06319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204174.06356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204174.06392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204174.06443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204174.06459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204174.06795: variable 'omit' from source: magic vars 16380 1727204174.06799: variable 'omit' from source: magic vars 16380 1727204174.06837: variable 'network_connections' from source: play vars 16380 1727204174.06851: variable 'profile' from source: play vars 16380 1727204174.06929: variable 'profile' from source: play vars 16380 1727204174.06941: variable 'interface' from source: set_fact 16380 1727204174.07011: variable 'interface' from source: set_fact 16380 1727204174.07194: variable 'omit' from source: magic vars 16380 1727204174.07205: 
variable '__lsr_ansible_managed' from source: task vars 16380 1727204174.07285: variable '__lsr_ansible_managed' from source: task vars 16380 1727204174.07594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 16380 1727204174.07827: Loaded config def from plugin (lookup/template) 16380 1727204174.07833: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 16380 1727204174.07870: File lookup term: get_ansible_managed.j2 16380 1727204174.07873: variable 'ansible_search_path' from source: unknown 16380 1727204174.07880: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 16380 1727204174.07899: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 16380 1727204174.07926: variable 'ansible_search_path' from source: unknown 16380 1727204174.19069: variable 'ansible_managed' from source: unknown 16380 1727204174.19497: variable 'omit' from source: magic vars 16380 1727204174.19503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204174.19509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204174.19513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204174.19516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204174.19520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204174.19523: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204174.19527: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204174.19530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204174.19624: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204174.19633: Set connection var ansible_shell_executable to /bin/sh 16380 1727204174.19641: Set connection var ansible_connection to ssh 16380 1727204174.19655: Set connection var ansible_shell_type to sh 16380 1727204174.19663: Set connection var ansible_pipelining to False 16380 1727204174.19674: Set connection var ansible_timeout to 10 16380 1727204174.19706: variable 'ansible_shell_executable' from source: unknown 16380 
1727204174.19710: variable 'ansible_connection' from source: unknown 16380 1727204174.19714: variable 'ansible_module_compression' from source: unknown 16380 1727204174.19716: variable 'ansible_shell_type' from source: unknown 16380 1727204174.19723: variable 'ansible_shell_executable' from source: unknown 16380 1727204174.19727: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204174.19733: variable 'ansible_pipelining' from source: unknown 16380 1727204174.19737: variable 'ansible_timeout' from source: unknown 16380 1727204174.19744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204174.19996: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204174.20008: variable 'omit' from source: magic vars 16380 1727204174.20011: starting attempt loop 16380 1727204174.20014: running the handler 16380 1727204174.20016: _low_level_execute_command(): starting 16380 1727204174.20018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204174.20686: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204174.20700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204174.20712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204174.20732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204174.20810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204174.20847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204174.20868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204174.20890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204174.21092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204174.23045: stdout chunk (state=3): >>>/root <<< 16380 1727204174.23141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204174.23147: stdout chunk (state=3): >>><<< 16380 1727204174.23158: stderr chunk (state=3): >>><<< 16380 1727204174.23186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204174.23204: _low_level_execute_command(): starting 16380 1727204174.23245: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691 `" && echo ansible-tmp-1727204174.2318716-18677-219352634253691="` echo /root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691 `" ) && sleep 0' 16380 1727204174.24634: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204174.24638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204174.24641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204174.24644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204174.24853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204174.24869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204174.24976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204174.25061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204174.27182: stdout chunk (state=3): >>>ansible-tmp-1727204174.2318716-18677-219352634253691=/root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691 <<< 16380 1727204174.27308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204174.27507: stderr chunk (state=3): >>><<< 16380 1727204174.27511: stdout chunk (state=3): >>><<< 16380 1727204174.27541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204174.2318716-18677-219352634253691=/root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204174.27695: variable 'ansible_module_compression' from source: unknown 16380 1727204174.27699: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 16380 1727204174.27702: variable 'ansible_facts' from source: unknown 16380 1727204174.27971: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/AnsiballZ_network_connections.py 16380 1727204174.28185: Sending initial data 16380 1727204174.28192: Sent initial data (168 bytes) 16380 1727204174.29307: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204174.29377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204174.29380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204174.29508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204174.29580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204174.31296: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 16380 1727204174.31304: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 16380 1727204174.31312: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 16380 1727204174.31322: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 16380 1727204174.31359: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 16380 1727204174.31362: stderr chunk 
(state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 16380 1727204174.31367: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204174.31393: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204174.31438: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpykax48gu /root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/AnsiballZ_network_connections.py <<< 16380 1727204174.31442: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/AnsiballZ_network_connections.py" <<< 16380 1727204174.31484: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpykax48gu" to remote "/root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/AnsiballZ_network_connections.py" <<< 16380 1727204174.34078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204174.34131: stderr chunk (state=3): >>><<< 16380 1727204174.34146: stdout chunk (state=3): >>><<< 16380 1727204174.34180: done transferring module to remote 16380 1727204174.34205: _low_level_execute_command(): starting 16380 1727204174.34220: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/ /root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/AnsiballZ_network_connections.py && sleep 0' 16380 1727204174.34864: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204174.34879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204174.34897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204174.34914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204174.35007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204174.35034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204174.35049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204174.35069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 16380 1727204174.35141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204174.37361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204174.37372: stdout chunk (state=3): >>><<< 16380 1727204174.37384: stderr chunk (state=3): >>><<< 16380 1727204174.37410: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204174.37422: _low_level_execute_command(): starting 16380 1727204174.37434: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/AnsiballZ_network_connections.py && sleep 0' 16380 1727204174.38075: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204174.38099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204174.38116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204174.38138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204174.38158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204174.38171: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204174.38185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204174.38206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16380 1727204174.38306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204174.38335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204174.38416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204174.73128: 
stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 16380 1727204174.75699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 16380 1727204174.75744: stderr chunk (state=3): >>><<< 16380 1727204174.75788: stdout chunk (state=3): >>><<< 16380 1727204174.75822: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
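
The exchange above is Ansible's standard AnsiballZ execution path for the fedora.linux_system_roles.network_connections module: create a remote temporary directory, SFTP the zipped module payload across, chmod it executable, run it with the remote /usr/bin/python3.12, and read the JSON result back over the multiplexed SSH connection (the temporary directory is removed right after). The module reports changed: true for taking connection LSR-TST-br31 down. A simplified shape of the play variables that would drive this invocation, consistent with the variable lookups logged earlier (profile from play vars, interface from set_fact); the indirection through profile is an assumption about how the test play is wired:

    vars:
      profile: "{{ interface }}"   # interface was set_fact'ed to LSR-TST-br31 in this run
      network_connections:
        - name: "{{ profile }}"
          state: down
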
16380 1727204174.75879: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204174.75908: _low_level_execute_command(): starting 16380 1727204174.75923: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204174.2318716-18677-219352634253691/ > /dev/null 2>&1 && sleep 0' 16380 1727204174.76665: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204174.76671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204174.76731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204174.76752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204174.76794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204174.76869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204174.79196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204174.79199: stdout chunk (state=3): >>><<< 16380 1727204174.79205: stderr chunk (state=3): >>><<< 16380 1727204174.79208: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204174.79210: handler run complete 16380 1727204174.79212: attempt loop complete, returning result 16380 1727204174.79214: _execute() done 16380 1727204174.79216: dumping result to json 16380 1727204174.79219: done dumping result, returning 16380 1727204174.79221: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-749c-b6eb-00000000004a] 16380 1727204174.79225: sending task result for task 12b410aa-8751-749c-b6eb-00000000004a 16380 1727204174.79309: done sending task result for task 12b410aa-8751-749c-b6eb-00000000004a 16380 1727204174.79313: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 16380 1727204174.79467: no more pending results, returning what we have 16380 1727204174.79471: results queue empty 16380 1727204174.79471: checking for any_errors_fatal 16380 1727204174.79477: done checking for any_errors_fatal 16380 1727204174.79485: checking for max_fail_percentage 16380 1727204174.79486: done checking for max_fail_percentage 16380 1727204174.79487: checking to see if all hosts have failed and the running result is not ok 16380 1727204174.79488: done checking to see if all hosts have failed 16380 1727204174.79491: getting the remaining hosts for this loop 16380 1727204174.79494: done getting the remaining hosts for this loop 16380 1727204174.79498: getting the next task for host managed-node2 16380 1727204174.79510: done getting next task for host managed-node2 16380 1727204174.79514: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 16380 1727204174.79519: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204174.79533: getting variables 16380 1727204174.79535: in VariableManager get_vars() 16380 1727204174.79587: Calling all_inventory to load vars for managed-node2 16380 1727204174.79593: Calling groups_inventory to load vars for managed-node2 16380 1727204174.79596: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204174.79615: Calling all_plugins_play to load vars for managed-node2 16380 1727204174.79621: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204174.79625: Calling groups_plugins_play to load vars for managed-node2 16380 1727204174.84766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204174.91804: done with get_vars() 16380 1727204174.91852: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.908) 0:00:36.026 ***** 16380 1727204174.91957: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16380 1727204174.92531: worker is 1 (out of 1 available) 16380 1727204174.92550: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16380 1727204174.92564: done queuing things up, now waiting for results queue to drain 16380 1727204174.92566: waiting for pending results... 16380 1727204174.93110: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 16380 1727204174.93208: in run() - task 12b410aa-8751-749c-b6eb-00000000004b 16380 1727204174.93345: variable 'ansible_search_path' from source: unknown 16380 1727204174.93349: variable 'ansible_search_path' from source: unknown 16380 1727204174.93497: calling self._execute() 16380 1727204174.93616: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204174.93627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204174.93644: variable 'omit' from source: magic vars 16380 1727204174.94730: variable 'ansible_distribution_major_version' from source: facts 16380 1727204174.94779: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204174.95232: variable 'network_state' from source: role '' defaults 16380 1727204174.95246: Evaluated conditional (network_state != {}): False 16380 1727204174.95250: when evaluation is False, skipping this task 16380 1727204174.95253: _execute() done 16380 1727204174.95292: dumping result to json 16380 1727204174.95296: done dumping result, returning 16380 1727204174.95299: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-749c-b6eb-00000000004b] 16380 1727204174.95301: sending task result for task 12b410aa-8751-749c-b6eb-00000000004b 16380 1727204174.95533: done sending task result for task 12b410aa-8751-749c-b6eb-00000000004b 16380 1727204174.95537: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204174.95622: no more pending results, returning what we have 16380 1727204174.95627: results queue empty 16380 1727204174.95628: checking for any_errors_fatal 16380 1727204174.95648: done checking for any_errors_fatal 16380 1727204174.95650: checking for max_fail_percentage 16380 
1727204174.95652: done checking for max_fail_percentage 16380 1727204174.95653: checking to see if all hosts have failed and the running result is not ok 16380 1727204174.95654: done checking to see if all hosts have failed 16380 1727204174.95655: getting the remaining hosts for this loop 16380 1727204174.95658: done getting the remaining hosts for this loop 16380 1727204174.95662: getting the next task for host managed-node2 16380 1727204174.95672: done getting next task for host managed-node2 16380 1727204174.95676: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16380 1727204174.95680: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204174.95704: getting variables 16380 1727204174.95707: in VariableManager get_vars() 16380 1727204174.95756: Calling all_inventory to load vars for managed-node2 16380 1727204174.95759: Calling groups_inventory to load vars for managed-node2 16380 1727204174.95763: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204174.96193: Calling all_plugins_play to load vars for managed-node2 16380 1727204174.96199: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204174.96204: Calling groups_plugins_play to load vars for managed-node2 16380 1727204174.99164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204175.02284: done with get_vars() 16380 1727204175.02329: done getting variables 16380 1727204175.02413: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.104) 0:00:36.131 ***** 16380 1727204175.02452: entering _queue_task() for managed-node2/debug 16380 1727204175.03010: worker is 1 (out of 1 available) 16380 1727204175.03022: exiting _queue_task() for managed-node2/debug 16380 1727204175.03034: done queuing things up, now waiting for results queue to drain 16380 1727204175.03036: waiting for pending results... 
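
"Configure networking state" is skipped for a different reason than the provider-gated tasks: network_state comes from the role defaults (note "from source: role '' defaults" above), where it defaults to an empty dict, so the gate network_state != {} is False unless the caller supplies a state. A sketch of that default-plus-gate pattern; the module's parameter name below is an assumption, since this skipped run never shows the argument list:

    # defaults/main.yml
    network_state: {}

    # tasks/main.yml
    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        state: "{{ network_state }}"   # parameter name assumed for illustration
      when: network_state != {}
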
16380 1727204175.03610: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16380 1727204175.03767: in run() - task 12b410aa-8751-749c-b6eb-00000000004c 16380 1727204175.03785: variable 'ansible_search_path' from source: unknown 16380 1727204175.03788: variable 'ansible_search_path' from source: unknown 16380 1727204175.04095: calling self._execute() 16380 1727204175.04141: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.04149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.04161: variable 'omit' from source: magic vars 16380 1727204175.05158: variable 'ansible_distribution_major_version' from source: facts 16380 1727204175.05172: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204175.05179: variable 'omit' from source: magic vars 16380 1727204175.05343: variable 'omit' from source: magic vars 16380 1727204175.05387: variable 'omit' from source: magic vars 16380 1727204175.05563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204175.05608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204175.05707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204175.05735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204175.05842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204175.05881: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204175.05885: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.05892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.06130: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204175.06138: Set connection var ansible_shell_executable to /bin/sh 16380 1727204175.06146: Set connection var ansible_connection to ssh 16380 1727204175.06153: Set connection var ansible_shell_type to sh 16380 1727204175.06160: Set connection var ansible_pipelining to False 16380 1727204175.06171: Set connection var ansible_timeout to 10 16380 1727204175.06395: variable 'ansible_shell_executable' from source: unknown 16380 1727204175.06399: variable 'ansible_connection' from source: unknown 16380 1727204175.06402: variable 'ansible_module_compression' from source: unknown 16380 1727204175.06404: variable 'ansible_shell_type' from source: unknown 16380 1727204175.06407: variable 'ansible_shell_executable' from source: unknown 16380 1727204175.06409: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.06411: variable 'ansible_pipelining' from source: unknown 16380 1727204175.06695: variable 'ansible_timeout' from source: unknown 16380 1727204175.06698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.06702: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 
1727204175.06714: variable 'omit' from source: magic vars 16380 1727204175.06727: starting attempt loop 16380 1727204175.06772: running the handler 16380 1727204175.06937: variable '__network_connections_result' from source: set_fact 16380 1727204175.07005: handler run complete 16380 1727204175.07029: attempt loop complete, returning result 16380 1727204175.07033: _execute() done 16380 1727204175.07036: dumping result to json 16380 1727204175.07041: done dumping result, returning 16380 1727204175.07053: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-749c-b6eb-00000000004c] 16380 1727204175.07058: sending task result for task 12b410aa-8751-749c-b6eb-00000000004c 16380 1727204175.07171: done sending task result for task 12b410aa-8751-749c-b6eb-00000000004c 16380 1727204175.07176: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 16380 1727204175.07250: no more pending results, returning what we have 16380 1727204175.07254: results queue empty 16380 1727204175.07255: checking for any_errors_fatal 16380 1727204175.07261: done checking for any_errors_fatal 16380 1727204175.07262: checking for max_fail_percentage 16380 1727204175.07265: done checking for max_fail_percentage 16380 1727204175.07266: checking to see if all hosts have failed and the running result is not ok 16380 1727204175.07267: done checking to see if all hosts have failed 16380 1727204175.07268: getting the remaining hosts for this loop 16380 1727204175.07270: done getting the remaining hosts for this loop 16380 1727204175.07275: getting the next task for host managed-node2 16380 1727204175.07285: done getting next task for host managed-node2 16380 1727204175.07291: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16380 1727204175.07294: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204175.07307: getting variables 16380 1727204175.07309: in VariableManager get_vars() 16380 1727204175.07356: Calling all_inventory to load vars for managed-node2 16380 1727204175.07360: Calling groups_inventory to load vars for managed-node2 16380 1727204175.07363: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204175.07375: Calling all_plugins_play to load vars for managed-node2 16380 1727204175.07378: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204175.07382: Calling groups_plugins_play to load vars for managed-node2 16380 1727204175.09877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204175.12954: done with get_vars() 16380 1727204175.13002: done getting variables 16380 1727204175.13084: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.106) 0:00:36.237 ***** 16380 1727204175.13125: entering _queue_task() for managed-node2/debug 16380 1727204175.13711: worker is 1 (out of 1 available) 16380 1727204175.13724: exiting _queue_task() for managed-node2/debug 16380 1727204175.13736: done queuing things up, now waiting for results queue to drain 16380 1727204175.13738: waiting for pending results... 16380 1727204175.13877: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16380 1727204175.13993: in run() - task 12b410aa-8751-749c-b6eb-00000000004d 16380 1727204175.14010: variable 'ansible_search_path' from source: unknown 16380 1727204175.14013: variable 'ansible_search_path' from source: unknown 16380 1727204175.14057: calling self._execute() 16380 1727204175.14169: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.14177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.14399: variable 'omit' from source: magic vars 16380 1727204175.14668: variable 'ansible_distribution_major_version' from source: facts 16380 1727204175.14682: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204175.14691: variable 'omit' from source: magic vars 16380 1727204175.14751: variable 'omit' from source: magic vars 16380 1727204175.14800: variable 'omit' from source: magic vars 16380 1727204175.14857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204175.14901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204175.14924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204175.14956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204175.14970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204175.15008: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204175.15013: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.15016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.15156: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204175.15177: Set connection var ansible_shell_executable to /bin/sh 16380 1727204175.15185: Set connection var ansible_connection to ssh 16380 1727204175.15193: Set connection var ansible_shell_type to sh 16380 1727204175.15201: Set connection var ansible_pipelining to False 16380 1727204175.15211: Set connection var ansible_timeout to 10 16380 1727204175.15238: variable 'ansible_shell_executable' from source: unknown 16380 1727204175.15242: variable 'ansible_connection' from source: unknown 16380 1727204175.15245: variable 'ansible_module_compression' from source: unknown 16380 1727204175.15248: variable 'ansible_shell_type' from source: unknown 16380 1727204175.15251: variable 'ansible_shell_executable' from source: unknown 16380 1727204175.15256: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.15261: variable 'ansible_pipelining' from source: unknown 16380 1727204175.15264: variable 'ansible_timeout' from source: unknown 16380 1727204175.15274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.15456: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204175.15469: variable 'omit' from source: magic vars 16380 1727204175.15476: starting attempt loop 16380 1727204175.15480: running the handler 16380 1727204175.15545: variable '__network_connections_result' from source: set_fact 16380 1727204175.15697: variable '__network_connections_result' from source: set_fact 16380 1727204175.15782: handler run complete 16380 1727204175.15829: attempt loop complete, returning result 16380 1727204175.15832: _execute() done 16380 1727204175.15835: dumping result to json 16380 1727204175.15841: done dumping result, returning 16380 1727204175.15852: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-749c-b6eb-00000000004d] 16380 1727204175.15858: sending task result for task 12b410aa-8751-749c-b6eb-00000000004d 16380 1727204175.15969: done sending task result for task 12b410aa-8751-749c-b6eb-00000000004d 16380 1727204175.15973: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 16380 1727204175.16074: no more pending results, returning what we have 16380 1727204175.16078: results queue empty 16380 1727204175.16079: checking for any_errors_fatal 16380 1727204175.16088: done checking for any_errors_fatal 16380 1727204175.16091: checking for max_fail_percentage 16380 1727204175.16093: done checking for max_fail_percentage 16380 1727204175.16094: checking to see if all 
hosts have failed and the running result is not ok 16380 1727204175.16095: done checking to see if all hosts have failed 16380 1727204175.16096: getting the remaining hosts for this loop 16380 1727204175.16098: done getting the remaining hosts for this loop 16380 1727204175.16103: getting the next task for host managed-node2 16380 1727204175.16110: done getting next task for host managed-node2 16380 1727204175.16115: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16380 1727204175.16117: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204175.16129: getting variables 16380 1727204175.16131: in VariableManager get_vars() 16380 1727204175.16174: Calling all_inventory to load vars for managed-node2 16380 1727204175.16178: Calling groups_inventory to load vars for managed-node2 16380 1727204175.16181: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204175.16312: Calling all_plugins_play to load vars for managed-node2 16380 1727204175.16317: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204175.16324: Calling groups_plugins_play to load vars for managed-node2 16380 1727204175.18860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204175.22017: done with get_vars() 16380 1727204175.22066: done getting variables 16380 1727204175.22152: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.090) 0:00:36.328 ***** 16380 1727204175.22199: entering _queue_task() for managed-node2/debug 16380 1727204175.22820: worker is 1 (out of 1 available) 16380 1727204175.22833: exiting _queue_task() for managed-node2/debug 16380 1727204175.22845: done queuing things up, now waiting for results queue to drain 16380 1727204175.22848: waiting for pending results... 
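For reference, the three "Show ... messages" tasks traced here are plain debug tasks. A minimal sketch, reconstructed only from the task names, variables, and conditionals visible in this trace (the role source in roles/network/tasks/main.yml is not part of this log, so the actual task bodies may differ):

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result

    - name: Show debug messages for the network_state
      debug:
        var: __network_state  # hypothetical variable; the log only records that this task is skipped
      when: network_state != {}

The when: condition on the last task matches the false_condition ("network_state != {}") reported when it is skipped in the trace that follows.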
16380 1727204175.23109: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16380 1727204175.23115: in run() - task 12b410aa-8751-749c-b6eb-00000000004e 16380 1727204175.23122: variable 'ansible_search_path' from source: unknown 16380 1727204175.23125: variable 'ansible_search_path' from source: unknown 16380 1727204175.23156: calling self._execute() 16380 1727204175.23272: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.23280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.23301: variable 'omit' from source: magic vars 16380 1727204175.23779: variable 'ansible_distribution_major_version' from source: facts 16380 1727204175.23896: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204175.23968: variable 'network_state' from source: role '' defaults 16380 1727204175.23981: Evaluated conditional (network_state != {}): False 16380 1727204175.23985: when evaluation is False, skipping this task 16380 1727204175.23988: _execute() done 16380 1727204175.23995: dumping result to json 16380 1727204175.24001: done dumping result, returning 16380 1727204175.24011: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-749c-b6eb-00000000004e] 16380 1727204175.24020: sending task result for task 12b410aa-8751-749c-b6eb-00000000004e 16380 1727204175.24132: done sending task result for task 12b410aa-8751-749c-b6eb-00000000004e 16380 1727204175.24136: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 16380 1727204175.24196: no more pending results, returning what we have 16380 1727204175.24201: results queue empty 16380 1727204175.24202: checking for any_errors_fatal 16380 1727204175.24212: done checking for any_errors_fatal 16380 1727204175.24213: checking for max_fail_percentage 16380 1727204175.24215: done checking for max_fail_percentage 16380 1727204175.24216: checking to see if all hosts have failed and the running result is not ok 16380 1727204175.24217: done checking to see if all hosts have failed 16380 1727204175.24218: getting the remaining hosts for this loop 16380 1727204175.24220: done getting the remaining hosts for this loop 16380 1727204175.24225: getting the next task for host managed-node2 16380 1727204175.24234: done getting next task for host managed-node2 16380 1727204175.24239: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 16380 1727204175.24243: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204175.24260: getting variables 16380 1727204175.24262: in VariableManager get_vars() 16380 1727204175.24419: Calling all_inventory to load vars for managed-node2 16380 1727204175.24423: Calling groups_inventory to load vars for managed-node2 16380 1727204175.24426: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204175.24440: Calling all_plugins_play to load vars for managed-node2 16380 1727204175.24444: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204175.24448: Calling groups_plugins_play to load vars for managed-node2 16380 1727204175.27108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204175.30706: done with get_vars() 16380 1727204175.30804: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.087) 0:00:36.416 ***** 16380 1727204175.30948: entering _queue_task() for managed-node2/ping 16380 1727204175.31670: worker is 1 (out of 1 available) 16380 1727204175.31685: exiting _queue_task() for managed-node2/ping 16380 1727204175.31701: done queuing things up, now waiting for results queue to drain 16380 1727204175.31704: waiting for pending results... 16380 1727204175.32413: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 16380 1727204175.32418: in run() - task 12b410aa-8751-749c-b6eb-00000000004f 16380 1727204175.32423: variable 'ansible_search_path' from source: unknown 16380 1727204175.32426: variable 'ansible_search_path' from source: unknown 16380 1727204175.32429: calling self._execute() 16380 1727204175.32523: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.32528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.32531: variable 'omit' from source: magic vars 16380 1727204175.33331: variable 'ansible_distribution_major_version' from source: facts 16380 1727204175.33351: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204175.33358: variable 'omit' from source: magic vars 16380 1727204175.33415: variable 'omit' from source: magic vars 16380 1727204175.33471: variable 'omit' from source: magic vars 16380 1727204175.33525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204175.33578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204175.33604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204175.33631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204175.33687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204175.33728: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204175.33732: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.33734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.34068: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204175.34072: Set 
connection var ansible_shell_executable to /bin/sh 16380 1727204175.34075: Set connection var ansible_connection to ssh 16380 1727204175.34078: Set connection var ansible_shell_type to sh 16380 1727204175.34080: Set connection var ansible_pipelining to False 16380 1727204175.34082: Set connection var ansible_timeout to 10 16380 1727204175.34084: variable 'ansible_shell_executable' from source: unknown 16380 1727204175.34087: variable 'ansible_connection' from source: unknown 16380 1727204175.34091: variable 'ansible_module_compression' from source: unknown 16380 1727204175.34093: variable 'ansible_shell_type' from source: unknown 16380 1727204175.34097: variable 'ansible_shell_executable' from source: unknown 16380 1727204175.34099: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204175.34102: variable 'ansible_pipelining' from source: unknown 16380 1727204175.34105: variable 'ansible_timeout' from source: unknown 16380 1727204175.34108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204175.34658: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204175.34663: variable 'omit' from source: magic vars 16380 1727204175.34665: starting attempt loop 16380 1727204175.34668: running the handler 16380 1727204175.34670: _low_level_execute_command(): starting 16380 1727204175.34673: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204175.35596: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204175.35797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204175.35801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204175.35804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204175.37598: stdout chunk (state=3): >>>/root <<< 16380 1727204175.37706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204175.37881: stderr chunk (state=3): >>><<< 16380 1727204175.37885: stdout chunk (state=3): >>><<< 16380 1727204175.37921: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204175.38096: _low_level_execute_command(): starting 16380 1727204175.38100: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812 `" && echo ansible-tmp-1727204175.379253-18797-169991238725812="` echo /root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812 `" ) && sleep 0' 16380 1727204175.38999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204175.39178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204175.39200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204175.39309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204175.39413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204175.41531: stdout chunk (state=3): >>>ansible-tmp-1727204175.379253-18797-169991238725812=/root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812 <<< 16380 1727204175.41666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204175.41750: stderr chunk (state=3): >>><<< 16380 1727204175.41762: stdout chunk (state=3): >>><<< 16380 1727204175.41807: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204175.379253-18797-169991238725812=/root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204175.41883: variable 'ansible_module_compression' from source: unknown 16380 1727204175.41986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 16380 1727204175.42035: variable 'ansible_facts' from source: unknown 16380 1727204175.42128: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/AnsiballZ_ping.py 16380 1727204175.42472: Sending initial data 16380 1727204175.42483: Sent initial data (152 bytes) 16380 1727204175.43027: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204175.43041: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204175.43135: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204175.43167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204175.43185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204175.43202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204175.43407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204175.45213: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 16380 1727204175.45234: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 16380 1727204175.45266: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204175.45331: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204175.45402: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpai8vea8v /root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/AnsiballZ_ping.py <<< 16380 1727204175.45406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/AnsiballZ_ping.py" <<< 16380 1727204175.45437: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpai8vea8v" to remote "/root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/AnsiballZ_ping.py" <<< 16380 1727204175.46614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204175.46618: stdout chunk (state=3): >>><<< 16380 1727204175.46621: stderr chunk (state=3): >>><<< 16380 1727204175.46623: done transferring module to remote 16380 1727204175.46625: _low_level_execute_command(): starting 16380 1727204175.46628: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/ /root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/AnsiballZ_ping.py && sleep 0' 16380 1727204175.47236: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204175.47240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204175.47242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204175.47245: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204175.47247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204175.47492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204175.47504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204175.47602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204175.49595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204175.49671: stderr chunk (state=3): >>><<< 16380 1727204175.49675: stdout chunk (state=3): >>><<< 16380 1727204175.49697: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204175.49701: _low_level_execute_command(): starting 16380 1727204175.49801: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/AnsiballZ_ping.py && sleep 0' 16380 1727204175.50329: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204175.50346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204175.50351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204175.50368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204175.50382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204175.50393: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204175.50404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204175.50420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16380 1727204175.50457: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204175.50463: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16380 1727204175.50469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204175.50472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204175.50474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204175.50567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204175.50571: stderr chunk (state=3): >>>debug2: match found <<< 16380 1727204175.50575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204175.50582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204175.50598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204175.50625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204175.50705: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204175.68504: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 16380 1727204175.70104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 16380 1727204175.70108: stdout chunk (state=3): >>><<< 16380 1727204175.70111: stderr chunk (state=3): >>><<< 16380 1727204175.70114: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
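The exchange above is the standard AnsiballZ round trip for one module invocation, all of it visible in this trace: probe the remote home directory (echo ~), mkdir a per-task temp directory under ~/.ansible/tmp, sftp the zipped module payload (AnsiballZ_ping.py) into it, chmod u+x, execute it with the remote interpreter (/usr/bin/python3.12), and, just below, rm -r the temp directory. The task driving it is plausibly the bare ping module with no arguments, which is consistent with the recorded module_args ({"data": "pong"} is the module default); a minimal sketch, not the role source:

    - name: Re-test connectivity
      ping: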
16380 1727204175.70116: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204175.70118: _low_level_execute_command(): starting 16380 1727204175.70120: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204175.379253-18797-169991238725812/ > /dev/null 2>&1 && sleep 0' 16380 1727204175.70805: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204175.70821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204175.70865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204175.70891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16380 1727204175.70976: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204175.71011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204175.71032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204175.71056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204175.71147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204175.73156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204175.73258: stderr chunk (state=3): >>><<< 16380 1727204175.73270: stdout chunk (state=3): >>><<< 16380 1727204175.73308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204175.73321: handler run complete 16380 1727204175.73458: attempt loop complete, returning result 16380 1727204175.73463: _execute() done 16380 1727204175.73466: dumping result to json 16380 1727204175.73469: done dumping result, returning 16380 1727204175.73471: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-749c-b6eb-00000000004f] 16380 1727204175.73473: sending task result for task 12b410aa-8751-749c-b6eb-00000000004f 16380 1727204175.73548: done sending task result for task 12b410aa-8751-749c-b6eb-00000000004f 16380 1727204175.73552: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 16380 1727204175.73630: no more pending results, returning what we have 16380 1727204175.73634: results queue empty 16380 1727204175.73635: checking for any_errors_fatal 16380 1727204175.73646: done checking for any_errors_fatal 16380 1727204175.73647: checking for max_fail_percentage 16380 1727204175.73649: done checking for max_fail_percentage 16380 1727204175.73651: checking to see if all hosts have failed and the running result is not ok 16380 1727204175.73652: done checking to see if all hosts have failed 16380 1727204175.73653: getting the remaining hosts for this loop 16380 1727204175.73655: done getting the remaining hosts for this loop 16380 1727204175.73660: getting the next task for host managed-node2 16380 1727204175.73670: done getting next task for host managed-node2 16380 1727204175.73672: ^ task is: TASK: meta (role_complete) 16380 1727204175.73674: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204175.73686: getting variables 16380 1727204175.73688: in VariableManager get_vars() 16380 1727204175.73737: Calling all_inventory to load vars for managed-node2 16380 1727204175.73740: Calling groups_inventory to load vars for managed-node2 16380 1727204175.73743: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204175.73755: Calling all_plugins_play to load vars for managed-node2 16380 1727204175.73758: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204175.73761: Calling groups_plugins_play to load vars for managed-node2 16380 1727204175.75459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204175.77593: done with get_vars() 16380 1727204175.77633: done getting variables 16380 1727204175.77782: done queuing things up, now waiting for results queue to drain 16380 1727204175.77785: results queue empty 16380 1727204175.77786: checking for any_errors_fatal 16380 1727204175.77792: done checking for any_errors_fatal 16380 1727204175.77793: checking for max_fail_percentage 16380 1727204175.77794: done checking for max_fail_percentage 16380 1727204175.77795: checking to see if all hosts have failed and the running result is not ok 16380 1727204175.77796: done checking to see if all hosts have failed 16380 1727204175.77796: getting the remaining hosts for this loop 16380 1727204175.77797: done getting the remaining hosts for this loop 16380 1727204175.77800: getting the next task for host managed-node2 16380 1727204175.77803: done getting next task for host managed-node2 16380 1727204175.77804: ^ task is: TASK: meta (flush_handlers) 16380 1727204175.77805: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204175.77808: getting variables 16380 1727204175.77809: in VariableManager get_vars() 16380 1727204175.77822: Calling all_inventory to load vars for managed-node2 16380 1727204175.77824: Calling groups_inventory to load vars for managed-node2 16380 1727204175.77825: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204175.77830: Calling all_plugins_play to load vars for managed-node2 16380 1727204175.77832: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204175.77834: Calling groups_plugins_play to load vars for managed-node2 16380 1727204175.79026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204175.80588: done with get_vars() 16380 1727204175.80617: done getting variables 16380 1727204175.80660: in VariableManager get_vars() 16380 1727204175.80670: Calling all_inventory to load vars for managed-node2 16380 1727204175.80672: Calling groups_inventory to load vars for managed-node2 16380 1727204175.80674: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204175.80678: Calling all_plugins_play to load vars for managed-node2 16380 1727204175.80681: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204175.80684: Calling groups_plugins_play to load vars for managed-node2 16380 1727204175.81932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204175.86296: done with get_vars() 16380 1727204175.86368: done queuing things up, now waiting for results queue to drain 16380 1727204175.86371: results queue empty 16380 1727204175.86373: checking for any_errors_fatal 16380 1727204175.86374: done checking for any_errors_fatal 16380 1727204175.86375: checking for max_fail_percentage 16380 1727204175.86377: done checking for max_fail_percentage 16380 1727204175.86378: checking to see if all hosts have failed and the running result is not ok 16380 1727204175.86379: done checking to see if all hosts have failed 16380 1727204175.86380: getting the remaining hosts for this loop 16380 1727204175.86381: done getting the remaining hosts for this loop 16380 1727204175.86384: getting the next task for host managed-node2 16380 1727204175.86391: done getting next task for host managed-node2 16380 1727204175.86393: ^ task is: TASK: meta (flush_handlers) 16380 1727204175.86395: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204175.86399: getting variables 16380 1727204175.86404: in VariableManager get_vars() 16380 1727204175.86421: Calling all_inventory to load vars for managed-node2 16380 1727204175.86424: Calling groups_inventory to load vars for managed-node2 16380 1727204175.86426: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204175.86439: Calling all_plugins_play to load vars for managed-node2 16380 1727204175.86448: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204175.86452: Calling groups_plugins_play to load vars for managed-node2 16380 1727204175.88586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204175.91336: done with get_vars() 16380 1727204175.91377: done getting variables 16380 1727204175.91452: in VariableManager get_vars() 16380 1727204175.91468: Calling all_inventory to load vars for managed-node2 16380 1727204175.91471: Calling groups_inventory to load vars for managed-node2 16380 1727204175.91474: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204175.91480: Calling all_plugins_play to load vars for managed-node2 16380 1727204175.91483: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204175.91487: Calling groups_plugins_play to load vars for managed-node2 16380 1727204175.92700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204175.94782: done with get_vars() 16380 1727204175.94817: done queuing things up, now waiting for results queue to drain 16380 1727204175.94821: results queue empty 16380 1727204175.94822: checking for any_errors_fatal 16380 1727204175.94823: done checking for any_errors_fatal 16380 1727204175.94824: checking for max_fail_percentage 16380 1727204175.94825: done checking for max_fail_percentage 16380 1727204175.94825: checking to see if all hosts have failed and the running result is not ok 16380 1727204175.94826: done checking to see if all hosts have failed 16380 1727204175.94826: getting the remaining hosts for this loop 16380 1727204175.94827: done getting the remaining hosts for this loop 16380 1727204175.94829: getting the next task for host managed-node2 16380 1727204175.94833: done getting next task for host managed-node2 16380 1727204175.94833: ^ task is: None 16380 1727204175.94834: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204175.94835: done queuing things up, now waiting for results queue to drain 16380 1727204175.94836: results queue empty 16380 1727204175.94837: checking for any_errors_fatal 16380 1727204175.94837: done checking for any_errors_fatal 16380 1727204175.94838: checking for max_fail_percentage 16380 1727204175.94839: done checking for max_fail_percentage 16380 1727204175.94839: checking to see if all hosts have failed and the running result is not ok 16380 1727204175.94840: done checking to see if all hosts have failed 16380 1727204175.94841: getting the next task for host managed-node2 16380 1727204175.94842: done getting next task for host managed-node2 16380 1727204175.94843: ^ task is: None 16380 1727204175.94844: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204175.94891: in VariableManager get_vars() 16380 1727204175.94915: done with get_vars() 16380 1727204175.94920: in VariableManager get_vars() 16380 1727204175.94928: done with get_vars() 16380 1727204175.94932: variable 'omit' from source: magic vars 16380 1727204175.94956: in VariableManager get_vars() 16380 1727204175.94963: done with get_vars() 16380 1727204175.94980: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 16380 1727204175.95188: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204175.95212: getting the remaining hosts for this loop 16380 1727204175.95214: done getting the remaining hosts for this loop 16380 1727204175.95216: getting the next task for host managed-node2 16380 1727204175.95219: done getting next task for host managed-node2 16380 1727204175.95222: ^ task is: TASK: Gathering Facts 16380 1727204175.95223: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204175.95225: getting variables 16380 1727204175.95226: in VariableManager get_vars() 16380 1727204175.95233: Calling all_inventory to load vars for managed-node2 16380 1727204175.95235: Calling groups_inventory to load vars for managed-node2 16380 1727204175.95237: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204175.95242: Calling all_plugins_play to load vars for managed-node2 16380 1727204175.95244: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204175.95246: Calling groups_plugins_play to load vars for managed-node2 16380 1727204175.96454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204176.00061: done with get_vars() 16380 1727204176.00110: done getting variables 16380 1727204176.00196: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 14:56:16 -0400 (0:00:00.693) 0:00:37.109 ***** 16380 1727204176.00250: entering _queue_task() for managed-node2/gather_facts 16380 1727204176.00653: worker is 1 (out of 1 available) 16380 1727204176.00668: exiting _queue_task() for managed-node2/gather_facts 16380 1727204176.00684: done queuing things up, now waiting for results queue to drain 16380 1727204176.00687: waiting for pending results... 
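The first play is now complete and the run has moved on to the next play in down_profile+delete_interface.yml. A minimal sketch of its header; the hosts pattern and fact gathering are assumptions inferred from this trace (only managed-node2 is seen executing, and the play opens with an implicit Gathering Facts task), not copied from the playbook source:

    - name: Delete the interface
      hosts: managed-node2  # assumption: only managed-node2 appears in this excerpt
      gather_facts: true    # matches the "Gathering Facts" task just queued above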
16380 1727204176.00979: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204176.01114: in run() - task 12b410aa-8751-749c-b6eb-000000000382 16380 1727204176.01154: variable 'ansible_search_path' from source: unknown 16380 1727204176.01186: calling self._execute() 16380 1727204176.01277: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204176.01306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204176.01315: variable 'omit' from source: magic vars 16380 1727204176.01733: variable 'ansible_distribution_major_version' from source: facts 16380 1727204176.01744: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204176.01751: variable 'omit' from source: magic vars 16380 1727204176.01780: variable 'omit' from source: magic vars 16380 1727204176.01815: variable 'omit' from source: magic vars 16380 1727204176.01864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204176.01907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204176.01927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204176.01945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204176.01959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204176.01988: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204176.01991: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204176.01998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204176.02131: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204176.02140: Set connection var ansible_shell_executable to /bin/sh 16380 1727204176.02146: Set connection var ansible_connection to ssh 16380 1727204176.02153: Set connection var ansible_shell_type to sh 16380 1727204176.02159: Set connection var ansible_pipelining to False 16380 1727204176.02168: Set connection var ansible_timeout to 10 16380 1727204176.02200: variable 'ansible_shell_executable' from source: unknown 16380 1727204176.02204: variable 'ansible_connection' from source: unknown 16380 1727204176.02224: variable 'ansible_module_compression' from source: unknown 16380 1727204176.02228: variable 'ansible_shell_type' from source: unknown 16380 1727204176.02231: variable 'ansible_shell_executable' from source: unknown 16380 1727204176.02235: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204176.02238: variable 'ansible_pipelining' from source: unknown 16380 1727204176.02240: variable 'ansible_timeout' from source: unknown 16380 1727204176.02243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204176.02426: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204176.02494: variable 'omit' from source: magic vars 16380 1727204176.02500: starting attempt loop 16380 1727204176.02503: running the 
handler 16380 1727204176.02506: variable 'ansible_facts' from source: unknown 16380 1727204176.02522: _low_level_execute_command(): starting 16380 1727204176.02542: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204176.03088: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204176.03112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204176.03137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204176.03185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204176.03191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204176.03194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204176.03238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204176.05027: stdout chunk (state=3): >>>/root <<< 16380 1727204176.05136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204176.05203: stderr chunk (state=3): >>><<< 16380 1727204176.05207: stdout chunk (state=3): >>><<< 16380 1727204176.05231: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204176.05243: _low_level_execute_command(): starting 16380 1727204176.05254: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091 `" && echo ansible-tmp-1727204176.052312-18923-112244126511091="` echo 
/root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091 `" ) && sleep 0' 16380 1727204176.05865: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204176.05910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204176.05971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204176.05999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204176.06054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204176.08152: stdout chunk (state=3): >>>ansible-tmp-1727204176.052312-18923-112244126511091=/root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091 <<< 16380 1727204176.08288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204176.08350: stderr chunk (state=3): >>><<< 16380 1727204176.08354: stdout chunk (state=3): >>><<< 16380 1727204176.08378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204176.052312-18923-112244126511091=/root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204176.08413: variable 'ansible_module_compression' from source: unknown 16380 1727204176.08492: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204176.08563: variable 'ansible_facts' from source: unknown 16380 1727204176.08694: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/AnsiballZ_setup.py 16380 1727204176.08829: Sending initial data 16380 1727204176.08832: Sent initial data (153 bytes) 16380 1727204176.09328: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204176.09332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204176.09335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204176.09337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204176.09394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204176.09409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204176.09459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204176.11188: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 16380 1727204176.11191: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204176.11221: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
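
"ANSIBALLZ: using cached module" above means the controller reuses an already-built payload: the setup module plus its module_utils dependencies, zipped with ZIP_DEFLATED and wrapped in a small self-extracting Python stub, so only one file has to cross the wire. The upload below then rides OpenSSH's sftp subsystem, which is what the SSH2_FXP_* protocol messages are. A toy sketch of the packaging idea, with an invented one-line module body (a simplification of the real AnsiballZ wrapper, not its actual code):

    import base64
    import io
    import zipfile

    # Bundle the "module" code into one in-memory zip archive.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("ansible/modules/setup.py", "print('facts would be gathered here')")

    # Embed the archive in a tiny bootstrap script; the real stub
    # extracts the zip to a temp dir and executes the module from it.
    payload = base64.b64encode(buf.getvalue()).decode()
    stub = (
        "import base64, io, zipfile\n"
        "zf = zipfile.ZipFile(io.BytesIO(base64.b64decode(%r)))\n"
        "print(zf.namelist())\n"
    ) % payload
    with open("AnsiballZ_sketch.py", "w") as f:
        f.write(stub)
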
<<< 16380 1727204176.11256: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp2keizurx /root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/AnsiballZ_setup.py <<< 16380 1727204176.11262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/AnsiballZ_setup.py" <<< 16380 1727204176.11297: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp2keizurx" to remote "/root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/AnsiballZ_setup.py" <<< 16380 1727204176.11301: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/AnsiballZ_setup.py" <<< 16380 1727204176.13036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204176.13111: stderr chunk (state=3): >>><<< 16380 1727204176.13114: stdout chunk (state=3): >>><<< 16380 1727204176.13139: done transferring module to remote 16380 1727204176.13152: _low_level_execute_command(): starting 16380 1727204176.13161: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/ /root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/AnsiballZ_setup.py && sleep 0' 16380 1727204176.13734: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204176.13737: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204176.13796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204176.13801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204176.13870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204176.15787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204176.15864: stderr chunk (state=3): >>><<< 16380 1727204176.15874: stdout chunk (state=3): >>><<< 16380 1727204176.15903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204176.15925: _low_level_execute_command(): starting 16380 1727204176.15929: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/AnsiballZ_setup.py && sleep 0' 16380 1727204176.16481: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204176.16484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204176.16487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204176.16496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204176.16498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204176.16555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204176.16563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204176.16621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204176.86523: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "16", "epoch": "1727204176", "epoch_int": "1727204176", "date": "2024-09-24", "time": "14:56:16", "iso8601_micro": "2024-09-24T18:56:16.485206Z", "iso8601": "2024-09-24T18:56:16Z", "iso8601_basic": "20240924T145616485206", "iso8601_basic_short": "20240924T145616", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 
2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2838, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 879, "free": 2838}, "nocache": {"free": 3466, "used": 251}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 680, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147505664, "block_size": 4096, "block_total": 64479564, "block_available": 61315309, "block_used": 3164255, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, 
"ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.48974609375, "5m": 0.52880859375, "15m": 0.34423828125}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204176.88811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204176.88815: stdout chunk (state=3): >>><<< 16380 1727204176.88818: stderr chunk (state=3): >>><<< 16380 1727204176.88821: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "16", "epoch": "1727204176", "epoch_int": "1727204176", "date": "2024-09-24", "time": "14:56:16", "iso8601_micro": "2024-09-24T18:56:16.485206Z", "iso8601": "2024-09-24T18:56:16Z", "iso8601_basic": "20240924T145616485206", "iso8601_basic_short": "20240924T145616", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2838, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 879, "free": 2838}, "nocache": {"free": 3466, "used": 251}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 680, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": 
[{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147505664, "block_size": 4096, "block_total": 64479564, "block_available": 61315309, "block_used": 3164255, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.48974609375, "5m": 0.52880859375, "15m": 0.34423828125}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204176.89566: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204176.89835: _low_level_execute_command(): starting 16380 1727204176.89839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204176.052312-18923-112244126511091/ > /dev/null 2>&1 && sleep 0' 16380 1727204176.91202: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204176.91343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204176.91432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204176.91502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204176.91515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204176.91581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204176.93599: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 16380 1727204176.93648: stderr chunk (state=3): >>><<< 16380 1727204176.93655: stdout chunk (state=3): >>><<< 16380 1727204176.93675: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204176.93687: handler run complete 16380 1727204176.93794: variable 'ansible_facts' from source: unknown 16380 1727204176.93880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204176.94141: variable 'ansible_facts' from source: unknown 16380 1727204176.94226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204176.94494: attempt loop complete, returning result 16380 1727204176.94497: _execute() done 16380 1727204176.94500: dumping result to json 16380 1727204176.94503: done dumping result, returning 16380 1727204176.94505: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-000000000382] 16380 1727204176.94507: sending task result for task 12b410aa-8751-749c-b6eb-000000000382 ok: [managed-node2] 16380 1727204176.95470: no more pending results, returning what we have 16380 1727204176.95474: results queue empty 16380 1727204176.95475: checking for any_errors_fatal 16380 1727204176.95477: done checking for any_errors_fatal 16380 1727204176.95477: checking for max_fail_percentage 16380 1727204176.95479: done checking for max_fail_percentage 16380 1727204176.95480: checking to see if all hosts have failed and the running result is not ok 16380 1727204176.95481: done checking to see if all hosts have failed 16380 1727204176.95482: getting the remaining hosts for this loop 16380 1727204176.95484: done getting the remaining hosts for this loop 16380 1727204176.95487: getting the next task for host managed-node2 16380 1727204176.95648: done getting next task for host managed-node2 16380 1727204176.95651: ^ task is: TASK: meta (flush_handlers) 16380 1727204176.95654: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204176.95666: getting variables 16380 1727204176.95668: in VariableManager get_vars() 16380 1727204176.95743: Calling all_inventory to load vars for managed-node2 16380 1727204176.95747: Calling groups_inventory to load vars for managed-node2 16380 1727204176.95751: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204176.95764: Calling all_plugins_play to load vars for managed-node2 16380 1727204176.95768: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204176.95773: done sending task result for task 12b410aa-8751-749c-b6eb-000000000382 16380 1727204176.95777: WORKER PROCESS EXITING 16380 1727204176.95782: Calling groups_plugins_play to load vars for managed-node2 16380 1727204176.97288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.00676: done with get_vars() 16380 1727204177.00703: done getting variables 16380 1727204177.00773: in VariableManager get_vars() 16380 1727204177.00787: Calling all_inventory to load vars for managed-node2 16380 1727204177.00790: Calling groups_inventory to load vars for managed-node2 16380 1727204177.00793: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204177.00797: Calling all_plugins_play to load vars for managed-node2 16380 1727204177.00799: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204177.00802: Calling groups_plugins_play to load vars for managed-node2 16380 1727204177.02560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.05724: done with get_vars() 16380 1727204177.05773: done queuing things up, now waiting for results queue to drain 16380 1727204177.05776: results queue empty 16380 1727204177.05777: checking for any_errors_fatal 16380 1727204177.05782: done checking for any_errors_fatal 16380 1727204177.05783: checking for max_fail_percentage 16380 1727204177.05785: done checking for max_fail_percentage 16380 1727204177.05786: checking to see if all hosts have failed and the running result is not ok 16380 1727204177.05787: done checking to see if all hosts have failed 16380 1727204177.05807: getting the remaining hosts for this loop 16380 1727204177.05809: done getting the remaining hosts for this loop 16380 1727204177.05813: getting the next task for host managed-node2 16380 1727204177.05818: done getting next task for host managed-node2 16380 1727204177.05821: ^ task is: TASK: Include the task 'delete_interface.yml' 16380 1727204177.05823: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204177.05826: getting variables 16380 1727204177.05827: in VariableManager get_vars() 16380 1727204177.05840: Calling all_inventory to load vars for managed-node2 16380 1727204177.05843: Calling groups_inventory to load vars for managed-node2 16380 1727204177.05846: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204177.05853: Calling all_plugins_play to load vars for managed-node2 16380 1727204177.05857: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204177.05861: Calling groups_plugins_play to load vars for managed-node2 16380 1727204177.08029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.10232: done with get_vars() 16380 1727204177.10278: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 14:56:17 -0400 (0:00:01.101) 0:00:38.210 ***** 16380 1727204177.10387: entering _queue_task() for managed-node2/include_tasks 16380 1727204177.10787: worker is 1 (out of 1 available) 16380 1727204177.10930: exiting _queue_task() for managed-node2/include_tasks 16380 1727204177.10941: done queuing things up, now waiting for results queue to drain 16380 1727204177.10944: waiting for pending results... 16380 1727204177.11148: running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' 16380 1727204177.11240: in run() - task 12b410aa-8751-749c-b6eb-000000000052 16380 1727204177.11258: variable 'ansible_search_path' from source: unknown 16380 1727204177.11298: calling self._execute() 16380 1727204177.11391: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204177.11397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204177.11408: variable 'omit' from source: magic vars 16380 1727204177.11752: variable 'ansible_distribution_major_version' from source: facts 16380 1727204177.11763: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204177.11769: _execute() done 16380 1727204177.11773: dumping result to json 16380 1727204177.11778: done dumping result, returning 16380 1727204177.11785: done running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' [12b410aa-8751-749c-b6eb-000000000052] 16380 1727204177.11792: sending task result for task 12b410aa-8751-749c-b6eb-000000000052 16380 1727204177.11901: done sending task result for task 12b410aa-8751-749c-b6eb-000000000052 16380 1727204177.11904: WORKER PROCESS EXITING 16380 1727204177.11957: no more pending results, returning what we have 16380 1727204177.11963: in VariableManager get_vars() 16380 1727204177.12001: Calling all_inventory to load vars for managed-node2 16380 1727204177.12005: Calling groups_inventory to load vars for managed-node2 16380 1727204177.12009: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204177.12024: Calling all_plugins_play to load vars for managed-node2 16380 1727204177.12028: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204177.12032: Calling groups_plugins_play to load vars for managed-node2 16380 1727204177.13967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.21325: done with get_vars() 16380 
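
The line "Evaluated conditional (ansible_distribution_major_version != '6'): True" above is the include task's `when:` clause rendered through Jinja2 against the facts gathered earlier (major version "39" on this Fedora node). Evaluated in isolation with the plain jinja2 library, the check looks roughly like this (Ansible layers its own templating and safety rules on top, so treat it as an approximation):

    from jinja2 import Environment

    facts = {"ansible_distribution_major_version": "39"}  # from the setup result above
    test = Environment().compile_expression("ansible_distribution_major_version != '6'")
    print(bool(test(**facts)))  # True, matching the log
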
1727204177.21354: variable 'ansible_search_path' from source: unknown 16380 1727204177.21371: we have included files to process 16380 1727204177.21373: generating all_blocks data 16380 1727204177.21374: done generating all_blocks data 16380 1727204177.21378: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 16380 1727204177.21380: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 16380 1727204177.21383: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 16380 1727204177.21686: done processing included file 16380 1727204177.21692: iterating over new_blocks loaded from include file 16380 1727204177.21694: in VariableManager get_vars() 16380 1727204177.21714: done with get_vars() 16380 1727204177.21721: filtering new block on tags 16380 1727204177.21743: done filtering new block on tags 16380 1727204177.21747: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 16380 1727204177.21752: extending task lists for all hosts with included blocks 16380 1727204177.21807: done extending task lists 16380 1727204177.21809: done processing included files 16380 1727204177.21810: results queue empty 16380 1727204177.21811: checking for any_errors_fatal 16380 1727204177.21813: done checking for any_errors_fatal 16380 1727204177.21813: checking for max_fail_percentage 16380 1727204177.21815: done checking for max_fail_percentage 16380 1727204177.21816: checking to see if all hosts have failed and the running result is not ok 16380 1727204177.21817: done checking to see if all hosts have failed 16380 1727204177.21819: getting the remaining hosts for this loop 16380 1727204177.21821: done getting the remaining hosts for this loop 16380 1727204177.21824: getting the next task for host managed-node2 16380 1727204177.21828: done getting next task for host managed-node2 16380 1727204177.21831: ^ task is: TASK: Remove test interface if necessary 16380 1727204177.21841: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204177.21844: getting variables 16380 1727204177.21845: in VariableManager get_vars() 16380 1727204177.21857: Calling all_inventory to load vars for managed-node2 16380 1727204177.21860: Calling groups_inventory to load vars for managed-node2 16380 1727204177.21863: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204177.21874: Calling all_plugins_play to load vars for managed-node2 16380 1727204177.21878: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204177.21885: Calling groups_plugins_play to load vars for managed-node2 16380 1727204177.23920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.25837: done with get_vars() 16380 1727204177.25859: done getting variables 16380 1727204177.25897: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:56:17 -0400 (0:00:00.155) 0:00:38.365 ***** 16380 1727204177.25918: entering _queue_task() for managed-node2/command 16380 1727204177.26256: worker is 1 (out of 1 available) 16380 1727204177.26274: exiting _queue_task() for managed-node2/command 16380 1727204177.26287: done queuing things up, now waiting for results queue to drain 16380 1727204177.26292: waiting for pending results... 16380 1727204177.26610: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 16380 1727204177.26713: in run() - task 12b410aa-8751-749c-b6eb-000000000393 16380 1727204177.26725: variable 'ansible_search_path' from source: unknown 16380 1727204177.26729: variable 'ansible_search_path' from source: unknown 16380 1727204177.26768: calling self._execute() 16380 1727204177.26864: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204177.26899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204177.26932: variable 'omit' from source: magic vars 16380 1727204177.27255: variable 'ansible_distribution_major_version' from source: facts 16380 1727204177.27266: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204177.27273: variable 'omit' from source: magic vars 16380 1727204177.27322: variable 'omit' from source: magic vars 16380 1727204177.27409: variable 'interface' from source: set_fact 16380 1727204177.27426: variable 'omit' from source: magic vars 16380 1727204177.27475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204177.27555: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204177.27595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204177.27600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204177.27603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 
1727204177.27634: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204177.27637: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204177.27640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204177.27736: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204177.27743: Set connection var ansible_shell_executable to /bin/sh 16380 1727204177.27750: Set connection var ansible_connection to ssh 16380 1727204177.27757: Set connection var ansible_shell_type to sh 16380 1727204177.27763: Set connection var ansible_pipelining to False 16380 1727204177.27771: Set connection var ansible_timeout to 10 16380 1727204177.27866: variable 'ansible_shell_executable' from source: unknown 16380 1727204177.27870: variable 'ansible_connection' from source: unknown 16380 1727204177.27872: variable 'ansible_module_compression' from source: unknown 16380 1727204177.27885: variable 'ansible_shell_type' from source: unknown 16380 1727204177.27904: variable 'ansible_shell_executable' from source: unknown 16380 1727204177.27909: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204177.27911: variable 'ansible_pipelining' from source: unknown 16380 1727204177.27914: variable 'ansible_timeout' from source: unknown 16380 1727204177.27924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204177.28011: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204177.28042: variable 'omit' from source: magic vars 16380 1727204177.28047: starting attempt loop 16380 1727204177.28050: running the handler 16380 1727204177.28052: _low_level_execute_command(): starting 16380 1727204177.28062: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204177.28769: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204177.28773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.28809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204177.28814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.28834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204177.28846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204177.28954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 16380 1727204177.30739: stdout chunk (state=3): >>>/root <<< 16380 1727204177.30885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204177.30940: stderr chunk (state=3): >>><<< 16380 1727204177.30943: stdout chunk (state=3): >>><<< 16380 1727204177.30987: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204177.30993: _low_level_execute_command(): starting 16380 1727204177.30999: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565 `" && echo ansible-tmp-1727204177.3097522-19006-167770916817565="` echo /root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565 `" ) && sleep 0' 16380 1727204177.31736: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.31743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204177.31753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.31849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204177.31853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204177.31894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204177.34031: stdout chunk (state=3): >>>ansible-tmp-1727204177.3097522-19006-167770916817565=/root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565 <<< 16380 
1727204177.34183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204177.34214: stderr chunk (state=3): >>><<< 16380 1727204177.34216: stdout chunk (state=3): >>><<< 16380 1727204177.34296: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204177.3097522-19006-167770916817565=/root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204177.34299: variable 'ansible_module_compression' from source: unknown 16380 1727204177.34312: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16380 1727204177.34355: variable 'ansible_facts' from source: unknown 16380 1727204177.34411: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/AnsiballZ_command.py 16380 1727204177.34529: Sending initial data 16380 1727204177.34532: Sent initial data (156 bytes) 16380 1727204177.35021: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204177.35024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.35028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204177.35030: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.35095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204177.35101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204177.35153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 16380 1727204177.36873: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 16380 1727204177.36877: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204177.36936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204177.36985: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp0vtmm6at /root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/AnsiballZ_command.py <<< 16380 1727204177.36994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/AnsiballZ_command.py" <<< 16380 1727204177.37023: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp0vtmm6at" to remote "/root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/AnsiballZ_command.py" <<< 16380 1727204177.37030: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/AnsiballZ_command.py" <<< 16380 1727204177.38120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204177.38229: stderr chunk (state=3): >>><<< 16380 1727204177.38232: stdout chunk (state=3): >>><<< 16380 1727204177.38251: done transferring module to remote 16380 1727204177.38263: _low_level_execute_command(): starting 16380 1727204177.38273: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/ /root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/AnsiballZ_command.py && sleep 0' 16380 1727204177.38914: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.38957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204177.38976: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204177.38993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204177.39043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204177.41016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204177.41064: stderr chunk (state=3): >>><<< 16380 1727204177.41072: stdout chunk (state=3): >>><<< 16380 1727204177.41122: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204177.41126: _low_level_execute_command(): starting 16380 1727204177.41128: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/AnsiballZ_command.py && sleep 0' 16380 1727204177.41752: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204177.41756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.41788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204177.41845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204177.60342: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-24 14:56:17.594040", "end": "2024-09-24 14:56:17.602243", "delta": 
"0:00:00.008203", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16380 1727204177.62077: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. <<< 16380 1727204177.62138: stderr chunk (state=3): >>><<< 16380 1727204177.62142: stdout chunk (state=3): >>><<< 16380 1727204177.62161: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-24 14:56:17.594040", "end": "2024-09-24 14:56:17.602243", "delta": "0:00:00.008203", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
16380 1727204177.62276: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204177.62280: _low_level_execute_command(): starting 16380 1727204177.62282: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204177.3097522-19006-167770916817565/ > /dev/null 2>&1 && sleep 0' 16380 1727204177.62675: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204177.62692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204177.62706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.62758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204177.62778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204177.62815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204177.64897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204177.64955: stderr chunk (state=3): >>><<< 16380 1727204177.65198: stdout chunk (state=3): >>><<< 16380 1727204177.65202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204177.65205: handler run complete 16380 1727204177.65208: Evaluated conditional (False): False 16380 1727204177.65210: attempt loop complete, returning result 16380 1727204177.65212: _execute() done 16380 1727204177.65214: dumping result to json 16380 1727204177.65216: done dumping result, returning 16380 1727204177.65218: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [12b410aa-8751-749c-b6eb-000000000393] 16380 1727204177.65220: sending task result for task 12b410aa-8751-749c-b6eb-000000000393 16380 1727204177.65302: done sending task result for task 12b410aa-8751-749c-b6eb-000000000393 16380 1727204177.65305: WORKER PROCESS EXITING
fatal: [managed-node2]: FAILED! => {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "LSR-TST-br31"
    ],
    "delta": "0:00:00.008203",
    "end": "2024-09-24 14:56:17.602243",
    "rc": 1,
    "start": "2024-09-24 14:56:17.594040"
}

STDERR:

Cannot find device "LSR-TST-br31"

MSG:

non-zero return code
...ignoring
16380 1727204177.65399: no more pending results, returning what we have 16380 1727204177.65403: results queue empty 16380 1727204177.65404: checking for any_errors_fatal 16380 1727204177.65406: done checking for any_errors_fatal 16380 1727204177.65407: checking for max_fail_percentage 16380 1727204177.65409: done checking for max_fail_percentage 16380 1727204177.65410: checking to see if all hosts have failed and the running result is not ok 16380 1727204177.65411: done checking to see if all hosts have failed 16380 1727204177.65412: getting the remaining hosts for this loop 16380 1727204177.65415: done getting the remaining hosts for this loop 16380 1727204177.65419: getting the next task for host managed-node2 16380 1727204177.65431: done getting next task for host managed-node2 16380 1727204177.65434: ^ task is: TASK: meta (flush_handlers) 16380 1727204177.65436: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 16380 1727204177.65442: getting variables 16380 1727204177.65444: in VariableManager get_vars() 16380 1727204177.65480: Calling all_inventory to load vars for managed-node2 16380 1727204177.65483: Calling groups_inventory to load vars for managed-node2 16380 1727204177.65487: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204177.65510: Calling all_plugins_play to load vars for managed-node2 16380 1727204177.65514: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204177.65518: Calling groups_plugins_play to load vars for managed-node2 16380 1727204177.68280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.71547: done with get_vars() 16380 1727204177.71594: done getting variables 16380 1727204177.71687: in VariableManager get_vars() 16380 1727204177.71703: Calling all_inventory to load vars for managed-node2 16380 1727204177.71706: Calling groups_inventory to load vars for managed-node2 16380 1727204177.71717: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204177.71724: Calling all_plugins_play to load vars for managed-node2 16380 1727204177.71728: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204177.71732: Calling groups_plugins_play to load vars for managed-node2 16380 1727204177.73862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.77192: done with get_vars() 16380 1727204177.77236: done queuing things up, now waiting for results queue to drain 16380 1727204177.77239: results queue empty 16380 1727204177.77240: checking for any_errors_fatal 16380 1727204177.77246: done checking for any_errors_fatal 16380 1727204177.77247: checking for max_fail_percentage 16380 1727204177.77249: done checking for max_fail_percentage 16380 1727204177.77250: checking to see if all hosts have failed and the running result is not ok 16380 1727204177.77251: done checking to see if all hosts have failed 16380 1727204177.77252: getting the remaining hosts for this loop 16380 1727204177.77253: done getting the remaining hosts for this loop 16380 1727204177.77256: getting the next task for host managed-node2 16380 1727204177.77270: done getting next task for host managed-node2 16380 1727204177.77272: ^ task is: TASK: meta (flush_handlers) 16380 1727204177.77274: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204177.77278: getting variables 16380 1727204177.77280: in VariableManager get_vars() 16380 1727204177.77295: Calling all_inventory to load vars for managed-node2 16380 1727204177.77298: Calling groups_inventory to load vars for managed-node2 16380 1727204177.77301: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204177.77309: Calling all_plugins_play to load vars for managed-node2 16380 1727204177.77312: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204177.77317: Calling groups_plugins_play to load vars for managed-node2 16380 1727204177.79485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.82397: done with get_vars() 16380 1727204177.82438: done getting variables 16380 1727204177.82503: in VariableManager get_vars() 16380 1727204177.82515: Calling all_inventory to load vars for managed-node2 16380 1727204177.82518: Calling groups_inventory to load vars for managed-node2 16380 1727204177.82521: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204177.82528: Calling all_plugins_play to load vars for managed-node2 16380 1727204177.82531: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204177.82534: Calling groups_plugins_play to load vars for managed-node2 16380 1727204177.84582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.88280: done with get_vars() 16380 1727204177.88331: done queuing things up, now waiting for results queue to drain 16380 1727204177.88334: results queue empty 16380 1727204177.88335: checking for any_errors_fatal 16380 1727204177.88343: done checking for any_errors_fatal 16380 1727204177.88344: checking for max_fail_percentage 16380 1727204177.88345: done checking for max_fail_percentage 16380 1727204177.88346: checking to see if all hosts have failed and the running result is not ok 16380 1727204177.88347: done checking to see if all hosts have failed 16380 1727204177.88348: getting the remaining hosts for this loop 16380 1727204177.88350: done getting the remaining hosts for this loop 16380 1727204177.88354: getting the next task for host managed-node2 16380 1727204177.88358: done getting next task for host managed-node2 16380 1727204177.88359: ^ task is: None 16380 1727204177.88361: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204177.88363: done queuing things up, now waiting for results queue to drain 16380 1727204177.88364: results queue empty 16380 1727204177.88365: checking for any_errors_fatal 16380 1727204177.88366: done checking for any_errors_fatal 16380 1727204177.88367: checking for max_fail_percentage 16380 1727204177.88368: done checking for max_fail_percentage 16380 1727204177.88369: checking to see if all hosts have failed and the running result is not ok 16380 1727204177.88370: done checking to see if all hosts have failed 16380 1727204177.88371: getting the next task for host managed-node2 16380 1727204177.88374: done getting next task for host managed-node2 16380 1727204177.88375: ^ task is: None 16380 1727204177.88377: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204177.88428: in VariableManager get_vars() 16380 1727204177.88457: done with get_vars() 16380 1727204177.88464: in VariableManager get_vars() 16380 1727204177.88481: done with get_vars() 16380 1727204177.88486: variable 'omit' from source: magic vars 16380 1727204177.88632: variable 'profile' from source: play vars 16380 1727204177.88849: in VariableManager get_vars() 16380 1727204177.88867: done with get_vars() 16380 1727204177.88896: variable 'omit' from source: magic vars 16380 1727204177.88978: variable 'profile' from source: play vars

PLAY [Remove {{ profile }}] ****************************************************

16380 1727204177.90609: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204177.90642: getting the remaining hosts for this loop 16380 1727204177.90647: done getting the remaining hosts for this loop 16380 1727204177.90650: getting the next task for host managed-node2 16380 1727204177.90654: done getting next task for host managed-node2 16380 1727204177.90656: ^ task is: TASK: Gathering Facts 16380 1727204177.90658: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204177.90661: getting variables 16380 1727204177.90662: in VariableManager get_vars() 16380 1727204177.90699: Calling all_inventory to load vars for managed-node2 16380 1727204177.90703: Calling groups_inventory to load vars for managed-node2 16380 1727204177.90706: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204177.90713: Calling all_plugins_play to load vars for managed-node2 16380 1727204177.90716: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204177.90724: Calling groups_plugins_play to load vars for managed-node2 16380 1727204177.92024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204177.94840: done with get_vars() 16380 1727204177.94911: done getting variables 16380 1727204177.95077: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Tuesday 24 September 2024 14:56:17 -0400 (0:00:00.692) 0:00:39.057 *****

16380 1727204177.95135: entering _queue_task() for managed-node2/gather_facts 16380 1727204177.96024: worker is 1 (out of 1 available) 16380 1727204177.96040: exiting _queue_task() for managed-node2/gather_facts 16380 1727204177.96052: done queuing things up, now waiting for results queue to drain 16380 1727204177.96054: waiting for pending results...
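Entering the new play, Ansible first re-gathers facts: the same transfer-and-execute cycle now ships AnsiballZ_setup.py, and the setup module returns the full ansible_facts dictionary seen below. In playbook terms the trigger is just the play header — a sketch assuming the play name and host from the banners above; the play's real task list is omitted:

    - name: Remove {{ profile }}
      hosts: managed-node2
      gather_facts: true  # implicit default; runs the setup module before the play's own tasks

Plays that do not need facts can set gather_facts: false to skip this whole round-trip.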
16380 1727204177.96353: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204177.96445: in run() - task 12b410aa-8751-749c-b6eb-0000000003a1 16380 1727204177.96490: variable 'ansible_search_path' from source: unknown 16380 1727204177.96549: calling self._execute() 16380 1727204177.96742: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204177.96775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204177.96809: variable 'omit' from source: magic vars 16380 1727204177.97537: variable 'ansible_distribution_major_version' from source: facts 16380 1727204177.97594: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204177.97598: variable 'omit' from source: magic vars 16380 1727204177.97622: variable 'omit' from source: magic vars 16380 1727204177.97715: variable 'omit' from source: magic vars 16380 1727204177.97795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204177.97938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204177.97942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204177.97952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204177.97970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204177.98010: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204177.98021: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204177.98034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204177.98240: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204177.98258: Set connection var ansible_shell_executable to /bin/sh 16380 1727204177.98271: Set connection var ansible_connection to ssh 16380 1727204177.98282: Set connection var ansible_shell_type to sh 16380 1727204177.98295: Set connection var ansible_pipelining to False 16380 1727204177.98309: Set connection var ansible_timeout to 10 16380 1727204177.98359: variable 'ansible_shell_executable' from source: unknown 16380 1727204177.98362: variable 'ansible_connection' from source: unknown 16380 1727204177.98364: variable 'ansible_module_compression' from source: unknown 16380 1727204177.98367: variable 'ansible_shell_type' from source: unknown 16380 1727204177.98374: variable 'ansible_shell_executable' from source: unknown 16380 1727204177.98467: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204177.98470: variable 'ansible_pipelining' from source: unknown 16380 1727204177.98472: variable 'ansible_timeout' from source: unknown 16380 1727204177.98475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204177.98634: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204177.98652: variable 'omit' from source: magic vars 16380 1727204177.98663: starting attempt loop 16380 1727204177.98670: running the 
handler 16380 1727204177.98698: variable 'ansible_facts' from source: unknown 16380 1727204177.98729: _low_level_execute_command(): starting 16380 1727204177.98742: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204177.99553: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204177.99573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204177.99607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204177.99674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204177.99737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204177.99753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204177.99798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204177.99870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204178.01730: stdout chunk (state=3): >>>/root <<< 16380 1727204178.01966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204178.02031: stderr chunk (state=3): >>><<< 16380 1727204178.02133: stdout chunk (state=3): >>><<< 16380 1727204178.02224: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204178.02420: _low_level_execute_command(): starting 16380 1727204178.02424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929 `" && echo 
ansible-tmp-1727204178.022524-19031-250723078072929="` echo /root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929 `" ) && sleep 0' 16380 1727204178.03673: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204178.03687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204178.03807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204178.03910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204178.03913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204178.04036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204178.04057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204178.04110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204178.04180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204178.04326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204178.06413: stdout chunk (state=3): >>>ansible-tmp-1727204178.022524-19031-250723078072929=/root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929 <<< 16380 1727204178.06651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204178.06655: stdout chunk (state=3): >>><<< 16380 1727204178.06658: stderr chunk (state=3): >>><<< 16380 1727204178.06802: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204178.022524-19031-250723078072929=/root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204178.06805: variable 
'ansible_module_compression' from source: unknown 16380 1727204178.06822: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204178.06899: variable 'ansible_facts' from source: unknown 16380 1727204178.07130: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/AnsiballZ_setup.py 16380 1727204178.07384: Sending initial data 16380 1727204178.07399: Sent initial data (153 bytes) 16380 1727204178.08018: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204178.08087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204178.08123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204178.08169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204178.08254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204178.10016: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204178.10068: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204178.10152: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpd_tz000t /root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/AnsiballZ_setup.py <<< 16380 1727204178.10156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/AnsiballZ_setup.py" <<< 16380 1727204178.10314: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpd_tz000t" to remote "/root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/AnsiballZ_setup.py" <<< 16380 1727204178.13167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204178.13228: stderr chunk (state=3): >>><<< 16380 1727204178.13261: stdout chunk (state=3): >>><<< 16380 1727204178.13302: done transferring module to remote 16380 1727204178.13353: _low_level_execute_command(): starting 16380 1727204178.13364: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/ /root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/AnsiballZ_setup.py && sleep 0' 16380 1727204178.14236: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204178.14257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204178.14330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204178.14345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204178.14421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204178.14448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204178.14833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204178.14836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204178.16866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204178.16887: stdout chunk (state=3): >>><<< 16380 1727204178.16906: stderr chunk (state=3): >>><<< 16380 1727204178.16932: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204178.16941: _low_level_execute_command(): starting 16380 1727204178.16951: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/AnsiballZ_setup.py && sleep 0' 16380 1727204178.17641: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204178.17710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204178.17714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204178.17796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204178.17851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204178.17887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204178.17976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204178.87005: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "18", "epoch": "1727204178", "epoch_int": "1727204178", "date": "2024-09-24", "time": "14:56:18", "iso8601_micro": "2024-09-24T18:56:18.491078Z", "iso8601": "2024-09-24T18:56:18Z", "iso8601_basic": "20240924T145618491078", "iso8601_basic_short": "20240924T145618", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", 
"ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_fips": false, "ansible_loadavg": {"1m": 0.48974609375, "5m": 0.52880859375, "15m": 0.34423828125}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2841, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 876, "free": 2841}, "nocache": {"free": 3470, "used": 247}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 682, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147489280, "block_size": 4096, "block_total": 64479564, "block_available": 61315305, "block_used": 3164259, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204178.89348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204178.89412: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204178.89708: stderr chunk (state=3): >>><<< 16380 1727204178.89712: stdout chunk (state=3): >>><<< 16380 1727204178.89716: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "18", "epoch": "1727204178", "epoch_int": "1727204178", "date": "2024-09-24", "time": "14:56:18", "iso8601_micro": "2024-09-24T18:56:18.491078Z", "iso8601": "2024-09-24T18:56:18Z", "iso8601_basic": "20240924T145618491078", "iso8601_basic_short": "20240924T145618", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_fips": false, "ansible_loadavg": {"1m": 0.48974609375, "5m": 0.52880859375, "15m": 0.34423828125}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2841, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 876, "free": 2841}, "nocache": {"free": 3470, "used": 247}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 682, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147489280, "block_size": 4096, "block_total": 64479564, "block_available": 61315305, "block_used": 3164259, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204178.90554: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204178.90628: _low_level_execute_command(): starting 16380 1727204178.90663: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204178.022524-19031-250723078072929/ > /dev/null 2>&1 && sleep 0' 16380 1727204178.91921: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204178.91973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204178.92079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204178.92127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204178.92195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204178.92210: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204178.94331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204178.94471: stderr chunk (state=3): >>><<< 16380 1727204178.94475: stdout chunk (state=3): >>><<< 16380 1727204178.94478: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204178.94480: handler run complete 16380 1727204178.95015: variable 'ansible_facts' from source: unknown 16380 1727204178.95209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204178.96168: variable 'ansible_facts' from source: unknown 16380 1727204178.96448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204178.96873: attempt loop complete, returning result 16380 1727204178.96884: _execute() done 16380 1727204178.97044: dumping result to json 16380 1727204178.97048: done dumping result, returning 16380 1727204178.97050: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-0000000003a1] 16380 1727204178.97053: sending task result for task 12b410aa-8751-749c-b6eb-0000000003a1 ok: [managed-node2] 16380 1727204178.98886: done sending task result for task 12b410aa-8751-749c-b6eb-0000000003a1 16380 1727204178.98893: WORKER PROCESS EXITING 16380 1727204178.98931: no more pending results, returning what we have 16380 1727204178.98934: results queue empty 16380 1727204178.98936: checking for any_errors_fatal 16380 1727204178.98937: done checking for any_errors_fatal 16380 1727204178.98938: checking for max_fail_percentage 16380 1727204178.98940: done checking for max_fail_percentage 16380 1727204178.98941: checking to see if all hosts have failed and the running result is not ok 16380 1727204178.98942: done checking to see if all hosts have failed 16380 1727204178.98943: getting the remaining hosts for this loop 16380 1727204178.98945: done getting the remaining hosts for this loop 16380 1727204178.98950: getting the next task for host managed-node2 16380 1727204178.98956: done getting next task for host managed-node2 16380 1727204178.98959: ^ task is: TASK: meta (flush_handlers) 16380 1727204178.98961: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204178.98965: getting variables 16380 1727204178.98967: in VariableManager get_vars() 16380 1727204178.99113: Calling all_inventory to load vars for managed-node2 16380 1727204178.99116: Calling groups_inventory to load vars for managed-node2 16380 1727204178.99122: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204178.99134: Calling all_plugins_play to load vars for managed-node2 16380 1727204178.99138: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204178.99142: Calling groups_plugins_play to load vars for managed-node2 16380 1727204179.03688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204179.10873: done with get_vars() 16380 1727204179.10929: done getting variables 16380 1727204179.11232: in VariableManager get_vars() 16380 1727204179.11251: Calling all_inventory to load vars for managed-node2 16380 1727204179.11255: Calling groups_inventory to load vars for managed-node2 16380 1727204179.11257: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204179.11264: Calling all_plugins_play to load vars for managed-node2 16380 1727204179.11267: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204179.11271: Calling groups_plugins_play to load vars for managed-node2 16380 1727204179.15638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204179.21221: done with get_vars() 16380 1727204179.21281: done queuing things up, now waiting for results queue to drain 16380 1727204179.21283: results queue empty 16380 1727204179.21284: checking for any_errors_fatal 16380 1727204179.21292: done checking for any_errors_fatal 16380 1727204179.21293: checking for max_fail_percentage 16380 1727204179.21294: done checking for max_fail_percentage 16380 1727204179.21295: checking to see if all hosts have failed and the running result is not ok 16380 1727204179.21296: done checking to see if all hosts have failed 16380 1727204179.21301: getting the remaining hosts for this loop 16380 1727204179.21302: done getting the remaining hosts for this loop 16380 1727204179.21305: getting the next task for host managed-node2 16380 1727204179.21310: done getting next task for host managed-node2 16380 1727204179.21313: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16380 1727204179.21315: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204179.21328: getting variables 16380 1727204179.21329: in VariableManager get_vars() 16380 1727204179.21347: Calling all_inventory to load vars for managed-node2 16380 1727204179.21350: Calling groups_inventory to load vars for managed-node2 16380 1727204179.21353: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204179.21363: Calling all_plugins_play to load vars for managed-node2 16380 1727204179.21366: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204179.21370: Calling groups_plugins_play to load vars for managed-node2 16380 1727204179.23551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204179.26642: done with get_vars() 16380 1727204179.26693: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:19 -0400 (0:00:01.316) 0:00:40.374 ***** 16380 1727204179.26801: entering _queue_task() for managed-node2/include_tasks 16380 1727204179.27415: worker is 1 (out of 1 available) 16380 1727204179.27430: exiting _queue_task() for managed-node2/include_tasks 16380 1727204179.27443: done queuing things up, now waiting for results queue to drain 16380 1727204179.27445: waiting for pending results... 16380 1727204179.27675: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 16380 1727204179.27829: in run() - task 12b410aa-8751-749c-b6eb-00000000005a 16380 1727204179.27855: variable 'ansible_search_path' from source: unknown 16380 1727204179.27864: variable 'ansible_search_path' from source: unknown 16380 1727204179.27923: calling self._execute() 16380 1727204179.28096: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204179.28100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204179.28104: variable 'omit' from source: magic vars 16380 1727204179.28571: variable 'ansible_distribution_major_version' from source: facts 16380 1727204179.28595: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204179.28608: _execute() done 16380 1727204179.28616: dumping result to json 16380 1727204179.28627: done dumping result, returning 16380 1727204179.28638: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-749c-b6eb-00000000005a] 16380 1727204179.28648: sending task result for task 12b410aa-8751-749c-b6eb-00000000005a 16380 1727204179.28855: done sending task result for task 12b410aa-8751-749c-b6eb-00000000005a 16380 1727204179.28858: WORKER PROCESS EXITING 16380 1727204179.28908: no more pending results, returning what we have 16380 1727204179.28913: in VariableManager get_vars() 16380 1727204179.28966: Calling all_inventory to load vars for managed-node2 16380 1727204179.28969: Calling groups_inventory to load vars for managed-node2 16380 1727204179.28972: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204179.28987: Calling all_plugins_play to load vars for managed-node2 16380 1727204179.28993: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204179.28997: Calling groups_plugins_play to load vars for managed-node2 16380 1727204179.31512: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204179.34642: done with get_vars() 16380 1727204179.34680: variable 'ansible_search_path' from source: unknown 16380 1727204179.34682: variable 'ansible_search_path' from source: unknown 16380 1727204179.34720: we have included files to process 16380 1727204179.34721: generating all_blocks data 16380 1727204179.34723: done generating all_blocks data 16380 1727204179.34724: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16380 1727204179.34725: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16380 1727204179.34728: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 16380 1727204179.36058: done processing included file 16380 1727204179.36061: iterating over new_blocks loaded from include file 16380 1727204179.36063: in VariableManager get_vars() 16380 1727204179.36094: done with get_vars() 16380 1727204179.36097: filtering new block on tags 16380 1727204179.36122: done filtering new block on tags 16380 1727204179.36125: in VariableManager get_vars() 16380 1727204179.36228: done with get_vars() 16380 1727204179.36230: filtering new block on tags 16380 1727204179.36372: done filtering new block on tags 16380 1727204179.36376: in VariableManager get_vars() 16380 1727204179.36405: done with get_vars() 16380 1727204179.36408: filtering new block on tags 16380 1727204179.36435: done filtering new block on tags 16380 1727204179.36438: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 16380 1727204179.36444: extending task lists for all hosts with included blocks 16380 1727204179.37798: done extending task lists 16380 1727204179.37800: done processing included files 16380 1727204179.37801: results queue empty 16380 1727204179.37802: checking for any_errors_fatal 16380 1727204179.37804: done checking for any_errors_fatal 16380 1727204179.37805: checking for max_fail_percentage 16380 1727204179.37806: done checking for max_fail_percentage 16380 1727204179.37807: checking to see if all hosts have failed and the running result is not ok 16380 1727204179.37808: done checking to see if all hosts have failed 16380 1727204179.37809: getting the remaining hosts for this loop 16380 1727204179.37811: done getting the remaining hosts for this loop 16380 1727204179.37814: getting the next task for host managed-node2 16380 1727204179.37821: done getting next task for host managed-node2 16380 1727204179.37825: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16380 1727204179.37828: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204179.37839: getting variables 16380 1727204179.37840: in VariableManager get_vars() 16380 1727204179.37859: Calling all_inventory to load vars for managed-node2 16380 1727204179.37862: Calling groups_inventory to load vars for managed-node2 16380 1727204179.37865: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204179.37872: Calling all_plugins_play to load vars for managed-node2 16380 1727204179.37876: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204179.37880: Calling groups_plugins_play to load vars for managed-node2 16380 1727204179.42324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204179.45495: done with get_vars() 16380 1727204179.45538: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.188) 0:00:40.562 ***** 16380 1727204179.45637: entering _queue_task() for managed-node2/setup 16380 1727204179.46224: worker is 1 (out of 1 available) 16380 1727204179.46236: exiting _queue_task() for managed-node2/setup 16380 1727204179.46247: done queuing things up, now waiting for results queue to drain 16380 1727204179.46249: waiting for pending results... 16380 1727204179.46488: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 16380 1727204179.46495: in run() - task 12b410aa-8751-749c-b6eb-0000000003e2 16380 1727204179.46516: variable 'ansible_search_path' from source: unknown 16380 1727204179.46529: variable 'ansible_search_path' from source: unknown 16380 1727204179.46579: calling self._execute() 16380 1727204179.46698: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204179.46711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204179.46731: variable 'omit' from source: magic vars 16380 1727204179.47462: variable 'ansible_distribution_major_version' from source: facts 16380 1727204179.47467: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204179.47879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204179.51269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204179.51364: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204179.51421: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204179.51472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204179.51513: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204179.51623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204179.51666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 16380 1727204179.51709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204179.51772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204179.51800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204179.51883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204179.51923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204179.51965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204179.52043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204179.52047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204179.52253: variable '__network_required_facts' from source: role '' defaults 16380 1727204179.52398: variable 'ansible_facts' from source: unknown 16380 1727204179.53582: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 16380 1727204179.53597: when evaluation is False, skipping this task 16380 1727204179.53606: _execute() done 16380 1727204179.53614: dumping result to json 16380 1727204179.53625: done dumping result, returning 16380 1727204179.53636: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-749c-b6eb-0000000003e2] 16380 1727204179.53646: sending task result for task 12b410aa-8751-749c-b6eb-0000000003e2 16380 1727204179.53895: done sending task result for task 12b410aa-8751-749c-b6eb-0000000003e2 16380 1727204179.53899: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204179.53953: no more pending results, returning what we have 16380 1727204179.53957: results queue empty 16380 1727204179.53958: checking for any_errors_fatal 16380 1727204179.53960: done checking for any_errors_fatal 16380 1727204179.53961: checking for max_fail_percentage 16380 1727204179.53963: done checking for max_fail_percentage 16380 1727204179.53964: checking to see if all hosts have failed and the running result is not ok 16380 1727204179.53965: done checking to see if all hosts have failed 16380 1727204179.53966: getting the remaining hosts for this loop 16380 1727204179.53968: done getting the remaining hosts for 
this loop 16380 1727204179.53972: getting the next task for host managed-node2 16380 1727204179.53984: done getting next task for host managed-node2 16380 1727204179.53988: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 16380 1727204179.53995: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204179.54012: getting variables 16380 1727204179.54014: in VariableManager get_vars() 16380 1727204179.54065: Calling all_inventory to load vars for managed-node2 16380 1727204179.54068: Calling groups_inventory to load vars for managed-node2 16380 1727204179.54072: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204179.54085: Calling all_plugins_play to load vars for managed-node2 16380 1727204179.54293: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204179.54300: Calling groups_plugins_play to load vars for managed-node2 16380 1727204179.58267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204179.62839: done with get_vars() 16380 1727204179.62886: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.177) 0:00:40.740 ***** 16380 1727204179.63387: entering _queue_task() for managed-node2/stat 16380 1727204179.63974: worker is 1 (out of 1 available) 16380 1727204179.64192: exiting _queue_task() for managed-node2/stat 16380 1727204179.64205: done queuing things up, now waiting for results queue to drain 16380 1727204179.64207: waiting for pending results... 
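
The skip recorded above is the role's guarded fact-gathering step: setup is re-run only when a fact the role needs is missing from what the play has already gathered. Reconstructed from the trace (the module name, the evaluated conditional, and the no_log censoring are all visible; the gather subset is not, so it is an assumption here), the task at set_facts.yml:3 plausibly reads, in task-file form:

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # assumption: the subset is not visible in the trace
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true

Because the full "Gathering Facts" run earlier in the play already populated every key the role requires, the difference() filter yields an empty list, the conditional evaluates to False, and the task is skipped without another round trip to the host.
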
16380 1727204179.64880: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 16380 1727204179.65699: in run() - task 12b410aa-8751-749c-b6eb-0000000003e4 16380 1727204179.65703: variable 'ansible_search_path' from source: unknown 16380 1727204179.65707: variable 'ansible_search_path' from source: unknown 16380 1727204179.65710: calling self._execute() 16380 1727204179.65993: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204179.66183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204179.66188: variable 'omit' from source: magic vars 16380 1727204179.66886: variable 'ansible_distribution_major_version' from source: facts 16380 1727204179.66910: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204179.67139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204179.67479: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204179.67545: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204179.67693: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204179.67698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204179.67754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204179.67794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204179.67841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204179.67879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204179.67992: variable '__network_is_ostree' from source: set_fact 16380 1727204179.68006: Evaluated conditional (not __network_is_ostree is defined): False 16380 1727204179.68013: when evaluation is False, skipping this task 16380 1727204179.68022: _execute() done 16380 1727204179.68033: dumping result to json 16380 1727204179.68041: done dumping result, returning 16380 1727204179.68051: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-749c-b6eb-0000000003e4] 16380 1727204179.68094: sending task result for task 12b410aa-8751-749c-b6eb-0000000003e4 16380 1727204179.68334: done sending task result for task 12b410aa-8751-749c-b6eb-0000000003e4 16380 1727204179.68338: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16380 1727204179.68397: no more pending results, returning what we have 16380 1727204179.68402: results queue empty 16380 1727204179.68403: checking for any_errors_fatal 16380 1727204179.68412: done checking for any_errors_fatal 16380 1727204179.68413: checking for 
max_fail_percentage 16380 1727204179.68414: done checking for max_fail_percentage 16380 1727204179.68415: checking to see if all hosts have failed and the running result is not ok 16380 1727204179.68416: done checking to see if all hosts have failed 16380 1727204179.68420: getting the remaining hosts for this loop 16380 1727204179.68423: done getting the remaining hosts for this loop 16380 1727204179.68427: getting the next task for host managed-node2 16380 1727204179.68435: done getting next task for host managed-node2 16380 1727204179.68439: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16380 1727204179.68442: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204179.68459: getting variables 16380 1727204179.68461: in VariableManager get_vars() 16380 1727204179.68508: Calling all_inventory to load vars for managed-node2 16380 1727204179.68511: Calling groups_inventory to load vars for managed-node2 16380 1727204179.68514: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204179.68528: Calling all_plugins_play to load vars for managed-node2 16380 1727204179.68532: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204179.68535: Calling groups_plugins_play to load vars for managed-node2 16380 1727204179.71485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204179.78035: done with get_vars() 16380 1727204179.78073: done getting variables 16380 1727204179.78242: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.148) 0:00:40.889 ***** 16380 1727204179.78283: entering _queue_task() for managed-node2/set_fact 16380 1727204179.79140: worker is 1 (out of 1 available) 16380 1727204179.79153: exiting _queue_task() for managed-node2/set_fact 16380 1727204179.79166: done queuing things up, now waiting for results queue to drain 16380 1727204179.79169: waiting for pending results... 
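
The stat task at set_facts.yml:12 is skipped because __network_is_ostree is already defined from an earlier pass through the role (false_condition: "not __network_is_ostree is defined"), so the probe never runs twice. Had it run, it would plausibly check for an ostree marker file; the path and register name below are assumptions, since the trace records only the module and the skip:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # assumption: the stat target is not shown in the trace
      register: __ostree_booted_stat    # hypothetical name, used only in this sketch
      when: not __network_is_ostree is defined
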
16380 1727204179.79809: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 16380 1727204179.79853: in run() - task 12b410aa-8751-749c-b6eb-0000000003e5 16380 1727204179.79885: variable 'ansible_search_path' from source: unknown 16380 1727204179.79897: variable 'ansible_search_path' from source: unknown 16380 1727204179.79953: calling self._execute() 16380 1727204179.80081: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204179.80098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204179.80116: variable 'omit' from source: magic vars 16380 1727204179.80604: variable 'ansible_distribution_major_version' from source: facts 16380 1727204179.80629: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204179.80852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204179.81205: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204179.81269: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204179.81325: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204179.81376: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204179.81579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204179.81583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204179.81586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204179.81624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204179.81743: variable '__network_is_ostree' from source: set_fact 16380 1727204179.81758: Evaluated conditional (not __network_is_ostree is defined): False 16380 1727204179.81766: when evaluation is False, skipping this task 16380 1727204179.81774: _execute() done 16380 1727204179.81782: dumping result to json 16380 1727204179.81795: done dumping result, returning 16380 1727204179.81810: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-749c-b6eb-0000000003e5] 16380 1727204179.81827: sending task result for task 12b410aa-8751-749c-b6eb-0000000003e5 16380 1727204179.82033: done sending task result for task 12b410aa-8751-749c-b6eb-0000000003e5 16380 1727204179.82037: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 16380 1727204179.82101: no more pending results, returning what we have 16380 1727204179.82105: results queue empty 16380 1727204179.82106: checking for any_errors_fatal 16380 1727204179.82117: done checking for any_errors_fatal 16380 
1727204179.82118: checking for max_fail_percentage 16380 1727204179.82120: done checking for max_fail_percentage 16380 1727204179.82120: checking to see if all hosts have failed and the running result is not ok 16380 1727204179.82122: done checking to see if all hosts have failed 16380 1727204179.82123: getting the remaining hosts for this loop 16380 1727204179.82125: done getting the remaining hosts for this loop 16380 1727204179.82129: getting the next task for host managed-node2 16380 1727204179.82142: done getting next task for host managed-node2 16380 1727204179.82337: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 16380 1727204179.82341: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204179.82358: getting variables 16380 1727204179.82360: in VariableManager get_vars() 16380 1727204179.82406: Calling all_inventory to load vars for managed-node2 16380 1727204179.82410: Calling groups_inventory to load vars for managed-node2 16380 1727204179.82413: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204179.82424: Calling all_plugins_play to load vars for managed-node2 16380 1727204179.82428: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204179.82432: Calling groups_plugins_play to load vars for managed-node2 16380 1727204179.84739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204179.87771: done with get_vars() 16380 1727204179.87822: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.096) 0:00:40.985 ***** 16380 1727204179.87943: entering _queue_task() for managed-node2/service_facts 16380 1727204179.88525: worker is 1 (out of 1 available) 16380 1727204179.88537: exiting _queue_task() for managed-node2/service_facts 16380 1727204179.88548: done queuing things up, now waiting for results queue to drain 16380 1727204179.88551: waiting for pending results... 
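
The companion set_fact task at set_facts.yml:17 is skipped under the same guard. Had the stat probe above actually run, a sketch of the flag assignment, reusing the hypothetical register name from that sketch, would be:

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # hypothetical: pairs with the stat sketch above
      when: not __network_is_ostree is defined
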
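
Unlike the two skipped tasks, the service_facts task queued here at set_facts.yml:21 does execute: the trace that follows shows a fresh remote temp directory being created, AnsiballZ_service_facts.py uploaded over the multiplexed SSH session, and the module made executable. Its playbook form is minimal, since service_facts takes no required arguments:

    - name: Check which services are running
      service_facts:
      # gathered service states land in ansible_facts.services for later tasks
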
16380 1727204179.88794: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 16380 1727204179.88863: in run() - task 12b410aa-8751-749c-b6eb-0000000003e7 16380 1727204179.88896: variable 'ansible_search_path' from source: unknown 16380 1727204179.88905: variable 'ansible_search_path' from source: unknown 16380 1727204179.88950: calling self._execute() 16380 1727204179.89070: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204179.89085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204179.89194: variable 'omit' from source: magic vars 16380 1727204179.89570: variable 'ansible_distribution_major_version' from source: facts 16380 1727204179.89592: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204179.89606: variable 'omit' from source: magic vars 16380 1727204179.89688: variable 'omit' from source: magic vars 16380 1727204179.89741: variable 'omit' from source: magic vars 16380 1727204179.89803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204179.89851: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204179.89886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204179.89915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204179.89934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204179.89982: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204179.89996: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204179.90007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204179.90142: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204179.90198: Set connection var ansible_shell_executable to /bin/sh 16380 1727204179.90203: Set connection var ansible_connection to ssh 16380 1727204179.90206: Set connection var ansible_shell_type to sh 16380 1727204179.90208: Set connection var ansible_pipelining to False 16380 1727204179.90210: Set connection var ansible_timeout to 10 16380 1727204179.90239: variable 'ansible_shell_executable' from source: unknown 16380 1727204179.90248: variable 'ansible_connection' from source: unknown 16380 1727204179.90258: variable 'ansible_module_compression' from source: unknown 16380 1727204179.90266: variable 'ansible_shell_type' from source: unknown 16380 1727204179.90275: variable 'ansible_shell_executable' from source: unknown 16380 1727204179.90309: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204179.90312: variable 'ansible_pipelining' from source: unknown 16380 1727204179.90315: variable 'ansible_timeout' from source: unknown 16380 1727204179.90317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204179.90562: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204179.90595: variable 'omit' from source: magic vars 16380 
1727204179.90599: starting attempt loop 16380 1727204179.90602: running the handler 16380 1727204179.90635: _low_level_execute_command(): starting 16380 1727204179.90643: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204179.91434: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204179.91494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204179.91518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204179.91664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204179.91669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204179.93737: stdout chunk (state=3): >>>/root <<< 16380 1727204179.93740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204179.93743: stdout chunk (state=3): >>><<< 16380 1727204179.93746: stderr chunk (state=3): >>><<< 16380 1727204179.94033: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204179.94037: _low_level_execute_command(): starting 16380 1727204179.94041: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145 `" && echo ansible-tmp-1727204179.93833-19095-20771215525145="` echo /root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145 `" ) && sleep 0' 16380 1727204179.95285: 
stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204179.95353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204179.95524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204179.95682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204179.97697: stdout chunk (state=3): >>>ansible-tmp-1727204179.93833-19095-20771215525145=/root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145 <<< 16380 1727204179.97802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204179.97862: stderr chunk (state=3): >>><<< 16380 1727204179.97873: stdout chunk (state=3): >>><<< 16380 1727204179.97995: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204179.93833-19095-20771215525145=/root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204179.98018: variable 'ansible_module_compression' from source: unknown 16380 1727204179.98108: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 16380 1727204179.98217: variable 'ansible_facts' from source: unknown 16380 1727204179.98497: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/AnsiballZ_service_facts.py 16380 1727204179.98717: Sending initial data 16380 1727204179.98727: Sent initial data 
(159 bytes) 16380 1727204179.99739: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204179.99981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204180.00117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204180.00159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204180.01934: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204180.02140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204180.02145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/AnsiballZ_service_facts.py" <<< 16380 1727204180.02148: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpknrhphxq /root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/AnsiballZ_service_facts.py <<< 16380 1727204180.02327: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpknrhphxq" to remote "/root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/AnsiballZ_service_facts.py" <<< 16380 1727204180.04496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204180.05097: stderr chunk (state=3): >>><<< 16380 1727204180.05185: stdout chunk (state=3): >>><<< 16380 1727204180.05188: done transferring module to remote 16380 1727204180.05193: _low_level_execute_command(): starting 16380 1727204180.05196: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/ /root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/AnsiballZ_service_facts.py && sleep 0' 16380 1727204180.06436: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204180.06450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204180.06571: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204180.06701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204180.06817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204180.06922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204180.08873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204180.08945: stderr chunk (state=3): >>><<< 16380 1727204180.09001: stdout chunk (state=3): >>><<< 16380 1727204180.09047: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204180.09054: _low_level_execute_command(): starting 16380 1727204180.09057: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/AnsiballZ_service_facts.py && sleep 0' 16380 1727204180.10531: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204180.10534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204180.10537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204180.10540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204180.10802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204182.14087: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": 
"dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 16380 1727204182.14123: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": 
{"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", 
"state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zr<<< 16380 1727204182.14247: stdout chunk (state=3): >>>am0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 16380 1727204182.15897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 16380 1727204182.16209: stderr chunk (state=3): >>><<< 16380 1727204182.16212: stdout chunk (state=3): >>><<< 16380 1727204182.16218: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": 
"stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": 
"systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": 
{"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204182.18130: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204182.18135: _low_level_execute_command(): starting 16380 1727204182.18138: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204179.93833-19095-20771215525145/ > /dev/null 2>&1 && sleep 0' 16380 1727204182.18941: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204182.18962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204182.19023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204182.19064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204182.19179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204182.19232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204182.19451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204182.21435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
16380 1727204182.21819: stderr chunk (state=3): >>><<< 16380 1727204182.21823: stdout chunk (state=3): >>><<< 16380 1727204182.21826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204182.21828: handler run complete 16380 1727204182.22285: variable 'ansible_facts' from source: unknown 16380 1727204182.23080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204182.24630: variable 'ansible_facts' from source: unknown 16380 1727204182.25394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204182.25792: attempt loop complete, returning result 16380 1727204182.26294: _execute() done 16380 1727204182.26298: dumping result to json 16380 1727204182.26301: done dumping result, returning 16380 1727204182.26305: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-749c-b6eb-0000000003e7] 16380 1727204182.26308: sending task result for task 12b410aa-8751-749c-b6eb-0000000003e7 16380 1727204182.29205: done sending task result for task 12b410aa-8751-749c-b6eb-0000000003e7 16380 1727204182.29209: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204182.29519: no more pending results, returning what we have 16380 1727204182.29523: results queue empty 16380 1727204182.29524: checking for any_errors_fatal 16380 1727204182.29527: done checking for any_errors_fatal 16380 1727204182.29528: checking for max_fail_percentage 16380 1727204182.29530: done checking for max_fail_percentage 16380 1727204182.29532: checking to see if all hosts have failed and the running result is not ok 16380 1727204182.29533: done checking to see if all hosts have failed 16380 1727204182.29534: getting the remaining hosts for this loop 16380 1727204182.29537: done getting the remaining hosts for this loop 16380 1727204182.29541: getting the next task for host managed-node2 16380 1727204182.29547: done getting next task for host managed-node2 16380 1727204182.29552: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 16380 1727204182.29555: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204182.29567: getting variables 16380 1727204182.29569: in VariableManager get_vars() 16380 1727204182.29613: Calling all_inventory to load vars for managed-node2 16380 1727204182.29620: Calling groups_inventory to load vars for managed-node2 16380 1727204182.29623: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204182.29637: Calling all_plugins_play to load vars for managed-node2 16380 1727204182.29642: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204182.29649: Calling groups_plugins_play to load vars for managed-node2 16380 1727204182.33682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204182.39728: done with get_vars() 16380 1727204182.39779: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:22 -0400 (0:00:02.523) 0:00:43.509 ***** 16380 1727204182.40318: entering _queue_task() for managed-node2/package_facts 16380 1727204182.41187: worker is 1 (out of 1 available) 16380 1727204182.41204: exiting _queue_task() for managed-node2/package_facts 16380 1727204182.41220: done queuing things up, now waiting for results queue to drain 16380 1727204182.41223: waiting for pending results... 
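
For reference, the two fact-gathering steps traced in this log ("Check which services are running" and "Check which packages are installed") are plain fact modules from ansible-core. A minimal standalone sketch of equivalent tasks follows; the play header and host name are hypothetical, and the role's real tasks live in roles/network/tasks/set_facts.yml inside the collection:

    # Sketch only: standalone equivalents of the tasks traced in this run
    - hosts: managed-node2
      gather_facts: false
      tasks:
        - name: Check which services are running
          ansible.builtin.service_facts:
          no_log: true        # matches the censored result shown above

        - name: Check which packages are installed
          ansible.builtin.package_facts:
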
16380 1727204182.41981: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 16380 1727204182.42404: in run() - task 12b410aa-8751-749c-b6eb-0000000003e8 16380 1727204182.42408: variable 'ansible_search_path' from source: unknown 16380 1727204182.42411: variable 'ansible_search_path' from source: unknown 16380 1727204182.42509: calling self._execute() 16380 1727204182.42960: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204182.42967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204182.42971: variable 'omit' from source: magic vars 16380 1727204182.43882: variable 'ansible_distribution_major_version' from source: facts 16380 1727204182.43911: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204182.43969: variable 'omit' from source: magic vars 16380 1727204182.44173: variable 'omit' from source: magic vars 16380 1727204182.44230: variable 'omit' from source: magic vars 16380 1727204182.44337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204182.44440: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204182.44524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204182.44552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204182.44615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204182.44827: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204182.44832: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204182.44835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204182.45010: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204182.45068: Set connection var ansible_shell_executable to /bin/sh 16380 1727204182.45099: Set connection var ansible_connection to ssh 16380 1727204182.45140: Set connection var ansible_shell_type to sh 16380 1727204182.45262: Set connection var ansible_pipelining to False 16380 1727204182.45266: Set connection var ansible_timeout to 10 16380 1727204182.45268: variable 'ansible_shell_executable' from source: unknown 16380 1727204182.45271: variable 'ansible_connection' from source: unknown 16380 1727204182.45274: variable 'ansible_module_compression' from source: unknown 16380 1727204182.45276: variable 'ansible_shell_type' from source: unknown 16380 1727204182.45280: variable 'ansible_shell_executable' from source: unknown 16380 1727204182.45282: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204182.45285: variable 'ansible_pipelining' from source: unknown 16380 1727204182.45395: variable 'ansible_timeout' from source: unknown 16380 1727204182.45399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204182.45865: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204182.46028: variable 'omit' from source: magic vars 16380 
1727204182.46136: starting attempt loop 16380 1727204182.46140: running the handler 16380 1727204182.46143: _low_level_execute_command(): starting 16380 1727204182.46147: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204182.47794: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204182.47842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204182.47898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204182.47927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204182.48068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204182.49840: stdout chunk (state=3): >>>/root <<< 16380 1727204182.50064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204182.50077: stdout chunk (state=3): >>><<< 16380 1727204182.50283: stderr chunk (state=3): >>><<< 16380 1727204182.50287: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204182.50293: _low_level_execute_command(): starting 16380 1727204182.50296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552 `" && echo ansible-tmp-1727204182.5021052-19296-253077652568552="` echo /root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552 `" ) && sleep 0' 16380 
1727204182.51430: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204182.51497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204182.51603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204182.51813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204182.51840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204182.53995: stdout chunk (state=3): >>>ansible-tmp-1727204182.5021052-19296-253077652568552=/root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552 <<< 16380 1727204182.54028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204182.54100: stderr chunk (state=3): >>><<< 16380 1727204182.54110: stdout chunk (state=3): >>><<< 16380 1727204182.54138: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204182.5021052-19296-253077652568552=/root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204182.54197: variable 'ansible_module_compression' from source: unknown 16380 1727204182.54352: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 16380 1727204182.54515: variable 'ansible_facts' from source: unknown 16380 1727204182.55116: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/AnsiballZ_package_facts.py 
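
Once this AnsiballZ payload executes, its result populates ansible_facts.packages, mirroring the ansible_facts.services dictionary returned by the earlier service_facts task: services maps a unit name to a dict with "state" and "status", while packages maps a package name to a list of per-version dicts. A hypothetical usage sketch, with key names taken from facts visible elsewhere in this log:

    # Sketch only: consuming the gathered facts in later tasks
    - name: Inspect a service fact gathered earlier
      ansible.builtin.debug:
        msg: "firewalld is {{ ansible_facts.services['firewalld.service'].state }}"

    - name: Inspect a package fact returned by this module
      ansible.builtin.debug:
        msg: "glibc {{ ansible_facts.packages['glibc'][0].version }} is installed"
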
16380 1727204182.55361: Sending initial data 16380 1727204182.55365: Sent initial data (162 bytes) 16380 1727204182.56573: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204182.56707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204182.56793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204182.56868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204182.58666: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204182.58774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/AnsiballZ_package_facts.py" <<< 16380 1727204182.58778: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpzhc1r1vl /root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/AnsiballZ_package_facts.py <<< 16380 1727204182.58781: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpzhc1r1vl" to remote "/root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/AnsiballZ_package_facts.py" <<< 16380 1727204182.63530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204182.63534: stderr chunk (state=3): >>><<< 16380 1727204182.63537: stdout chunk (state=3): >>><<< 16380 1727204182.63539: done transferring module to remote 16380 1727204182.63553: _low_level_execute_command(): starting 16380 1727204182.63565: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/ /root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/AnsiballZ_package_facts.py && sleep 0' 16380 1727204182.64949: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204182.64972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204182.64987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204182.65261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204182.65306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204182.65412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204182.67325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204182.67399: stderr chunk (state=3): >>><<< 16380 1727204182.67411: stdout chunk (state=3): >>><<< 16380 1727204182.67584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204182.67591: _low_level_execute_command(): starting 16380 1727204182.67594: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/AnsiballZ_package_facts.py && sleep 0' 16380 1727204182.68793: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204182.69061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204182.69064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204182.69351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204183.34064: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": 
"fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": 
"tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 16380 1727204183.34162: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": 
"libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": 
"ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 16380 1727204183.34313: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": 
"openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5<<< 16380 1727204183.34399: stdout chunk (state=3): >>>", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", 
"release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": 
[{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 16380 1727204183.34406: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", 
"version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 16380 1727204183.34410: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 16380 1727204183.34416: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 16380 1727204183.36319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204183.36342: stderr chunk (state=3): >>><<< 16380 1727204183.36349: stdout chunk (state=3): >>><<< 16380 1727204183.36606: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204183.47805: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204183.47849: _low_level_execute_command(): starting 16380 1727204183.47871: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204182.5021052-19296-253077652568552/ > /dev/null 2>&1 && sleep 0' 16380 1727204183.48682: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204183.48700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204183.48753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204183.48769: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204183.48782: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 16380 1727204183.48865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204183.48907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204183.48941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204183.48977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204183.49175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204183.51417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204183.51423: stdout chunk (state=3): >>><<< 16380 1727204183.51434: stderr chunk (state=3): >>><<< 16380 1727204183.51456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204183.51470: handler run complete 16380 1727204183.54856: variable 'ansible_facts' from source: unknown 16380 1727204183.56997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204183.65658: variable 'ansible_facts' from source: unknown 16380 1727204183.67400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204183.70554: attempt loop complete, returning result 16380 1727204183.70596: _execute() done 16380 1727204183.70599: dumping result to json 16380 1727204183.71360: done dumping result, returning 16380 1727204183.71371: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-749c-b6eb-0000000003e8] 16380 1727204183.71376: sending task result for task 12b410aa-8751-749c-b6eb-0000000003e8 16380 1727204183.89547: done sending task result for task 12b410aa-8751-749c-b6eb-0000000003e8 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204183.89636: no more pending results, returning what we have 16380 1727204183.89640: results queue empty 16380 1727204183.89641: checking for any_errors_fatal 16380 1727204183.89646: done checking for any_errors_fatal 16380 1727204183.89647: checking for max_fail_percentage 16380 1727204183.89651: done checking for max_fail_percentage 16380 1727204183.89652: checking to see if all hosts have failed and the running result is not ok 16380 1727204183.89653: done checking to see if all hosts have failed 16380 1727204183.89654: getting the remaining hosts for this loop 16380 1727204183.89655: done getting the remaining hosts for this loop 16380 1727204183.89659: getting the next task for host managed-node2 16380 1727204183.89667: done getting next task for host managed-node2 16380 1727204183.89671: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 16380 1727204183.89673: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204183.89684: getting variables 16380 1727204183.89685: in VariableManager get_vars() 16380 1727204183.89715: Calling all_inventory to load vars for managed-node2 16380 1727204183.89720: Calling groups_inventory to load vars for managed-node2 16380 1727204183.89723: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204183.89731: Calling all_plugins_play to load vars for managed-node2 16380 1727204183.89734: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204183.89963: Calling groups_plugins_play to load vars for managed-node2 16380 1727204183.90485: WORKER PROCESS EXITING 16380 1727204183.94976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204183.99038: done with get_vars() 16380 1727204183.99080: done getting variables 16380 1727204183.99148: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:23 -0400 (0:00:01.588) 0:00:45.098 ***** 16380 1727204183.99181: entering _queue_task() for managed-node2/debug 16380 1727204183.99610: worker is 1 (out of 1 available) 16380 1727204183.99624: exiting _queue_task() for managed-node2/debug 16380 1727204183.99635: done queuing things up, now waiting for results queue to drain 16380 1727204183.99638: waiting for pending results... 
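
The package_facts run above ends in a detail worth noting: the task result is censored ("'no_log: true' was specified"), yet the full package list still appears earlier in the raw output, because the low-level ANSIBLE_DEBUG command tracing logs the module's stdout before the no_log filtering that is applied to the returned result. The role's task file itself is not part of this log; a minimal sketch consistent with the logged module_args ({'manager': ['auto'], 'strategy': 'first'}) would be:

    # Sketch reconstructed from the trace above, not the role's actual source.
    # manager/strategy come from the logged module_args; no_log matches the
    # censored task result.
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
        strategy: first
      no_log: true
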
16380 1727204183.99900: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 16380 1727204184.00038: in run() - task 12b410aa-8751-749c-b6eb-00000000005b 16380 1727204184.00063: variable 'ansible_search_path' from source: unknown 16380 1727204184.00072: variable 'ansible_search_path' from source: unknown 16380 1727204184.00128: calling self._execute() 16380 1727204184.00253: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204184.00268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204184.00285: variable 'omit' from source: magic vars 16380 1727204184.00762: variable 'ansible_distribution_major_version' from source: facts 16380 1727204184.00780: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204184.00794: variable 'omit' from source: magic vars 16380 1727204184.00851: variable 'omit' from source: magic vars 16380 1727204184.00984: variable 'network_provider' from source: set_fact 16380 1727204184.01011: variable 'omit' from source: magic vars 16380 1727204184.01067: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204184.01117: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204184.01143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204184.01178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204184.01201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204184.01240: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204184.01250: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204184.01259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204184.01394: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204184.01413: Set connection var ansible_shell_executable to /bin/sh 16380 1727204184.01425: Set connection var ansible_connection to ssh 16380 1727204184.01436: Set connection var ansible_shell_type to sh 16380 1727204184.01447: Set connection var ansible_pipelining to False 16380 1727204184.01494: Set connection var ansible_timeout to 10 16380 1727204184.01499: variable 'ansible_shell_executable' from source: unknown 16380 1727204184.01506: variable 'ansible_connection' from source: unknown 16380 1727204184.01517: variable 'ansible_module_compression' from source: unknown 16380 1727204184.01525: variable 'ansible_shell_type' from source: unknown 16380 1727204184.01533: variable 'ansible_shell_executable' from source: unknown 16380 1727204184.01540: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204184.01597: variable 'ansible_pipelining' from source: unknown 16380 1727204184.01601: variable 'ansible_timeout' from source: unknown 16380 1727204184.01603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204184.01818: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 16380 1727204184.01853: variable 'omit' from source: magic vars 16380 1727204184.01864: starting attempt loop 16380 1727204184.01894: running the handler 16380 1727204184.02167: handler run complete 16380 1727204184.02171: attempt loop complete, returning result 16380 1727204184.02173: _execute() done 16380 1727204184.02176: dumping result to json 16380 1727204184.02179: done dumping result, returning 16380 1727204184.02181: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-749c-b6eb-00000000005b] 16380 1727204184.02184: sending task result for task 12b410aa-8751-749c-b6eb-00000000005b ok: [managed-node2] => {} MSG: Using network provider: nm 16380 1727204184.02443: no more pending results, returning what we have 16380 1727204184.02448: results queue empty 16380 1727204184.02449: checking for any_errors_fatal 16380 1727204184.02463: done checking for any_errors_fatal 16380 1727204184.02464: checking for max_fail_percentage 16380 1727204184.02466: done checking for max_fail_percentage 16380 1727204184.02467: checking to see if all hosts have failed and the running result is not ok 16380 1727204184.02468: done checking to see if all hosts have failed 16380 1727204184.02469: getting the remaining hosts for this loop 16380 1727204184.02471: done getting the remaining hosts for this loop 16380 1727204184.02476: getting the next task for host managed-node2 16380 1727204184.02486: done getting next task for host managed-node2 16380 1727204184.02490: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16380 1727204184.02492: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204184.02506: getting variables 16380 1727204184.02508: in VariableManager get_vars() 16380 1727204184.02552: Calling all_inventory to load vars for managed-node2 16380 1727204184.02555: Calling groups_inventory to load vars for managed-node2 16380 1727204184.02558: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204184.02570: Calling all_plugins_play to load vars for managed-node2 16380 1727204184.02574: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204184.02578: Calling groups_plugins_play to load vars for managed-node2 16380 1727204184.03217: done sending task result for task 12b410aa-8751-749c-b6eb-00000000005b 16380 1727204184.03220: WORKER PROCESS EXITING 16380 1727204184.07595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204184.14019: done with get_vars() 16380 1727204184.14070: done getting variables 16380 1727204184.14261: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.152) 0:00:45.250 ***** 16380 1727204184.14412: entering _queue_task() for managed-node2/fail 16380 1727204184.15213: worker is 1 (out of 1 available) 16380 1727204184.15226: exiting _queue_task() for managed-node2/fail 16380 1727204184.15239: done queuing things up, now waiting for results queue to drain 16380 1727204184.15242: waiting for pending results... 
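
The "Print network provider" task above (roles/network/tasks/main.yml:7) resolves to the ansible.builtin.debug action and prints "Using network provider: nm", with network_provider read from an earlier set_fact. Its source is not included in this log; a minimal sketch consistent with the trace, where only the task name and the printed message are taken from the log:

    # Sketch only. The conditional (ansible_distribution_major_version != '6')
    # evaluated in the trace appears on every task here, suggesting a shared
    # role-level guard rather than a per-task `when` -- not shown.
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
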
16380 1727204184.15716: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 16380 1727204184.16154: in run() - task 12b410aa-8751-749c-b6eb-00000000005c 16380 1727204184.16160: variable 'ansible_search_path' from source: unknown 16380 1727204184.16164: variable 'ansible_search_path' from source: unknown 16380 1727204184.16168: calling self._execute() 16380 1727204184.16421: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204184.16439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204184.16496: variable 'omit' from source: magic vars 16380 1727204184.17540: variable 'ansible_distribution_major_version' from source: facts 16380 1727204184.17569: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204184.17988: variable 'network_state' from source: role '' defaults 16380 1727204184.17994: Evaluated conditional (network_state != {}): False 16380 1727204184.18018: when evaluation is False, skipping this task 16380 1727204184.18028: _execute() done 16380 1727204184.18037: dumping result to json 16380 1727204184.18046: done dumping result, returning 16380 1727204184.18133: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-749c-b6eb-00000000005c] 16380 1727204184.18146: sending task result for task 12b410aa-8751-749c-b6eb-00000000005c skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204184.18499: no more pending results, returning what we have 16380 1727204184.18503: results queue empty 16380 1727204184.18504: checking for any_errors_fatal 16380 1727204184.18513: done checking for any_errors_fatal 16380 1727204184.18514: checking for max_fail_percentage 16380 1727204184.18515: done checking for max_fail_percentage 16380 1727204184.18516: checking to see if all hosts have failed and the running result is not ok 16380 1727204184.18517: done checking to see if all hosts have failed 16380 1727204184.18518: getting the remaining hosts for this loop 16380 1727204184.18520: done getting the remaining hosts for this loop 16380 1727204184.18525: getting the next task for host managed-node2 16380 1727204184.18532: done getting next task for host managed-node2 16380 1727204184.18535: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16380 1727204184.18539: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204184.18557: getting variables 16380 1727204184.18559: in VariableManager get_vars() 16380 1727204184.18613: Calling all_inventory to load vars for managed-node2 16380 1727204184.18617: Calling groups_inventory to load vars for managed-node2 16380 1727204184.18620: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204184.18717: Calling all_plugins_play to load vars for managed-node2 16380 1727204184.18721: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204184.18725: Calling groups_plugins_play to load vars for managed-node2 16380 1727204184.19551: done sending task result for task 12b410aa-8751-749c-b6eb-00000000005c 16380 1727204184.19556: WORKER PROCESS EXITING 16380 1727204184.24236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204184.30945: done with get_vars() 16380 1727204184.30997: done getting variables 16380 1727204184.31127: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.167) 0:00:45.418 ***** 16380 1727204184.31207: entering _queue_task() for managed-node2/fail 16380 1727204184.31977: worker is 1 (out of 1 available) 16380 1727204184.32140: exiting _queue_task() for managed-node2/fail 16380 1727204184.32153: done queuing things up, now waiting for results queue to drain 16380 1727204184.32156: waiting for pending results... 
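
The fail task above (main.yml:11) was skipped because its first when condition, network_state != {}, evaluated false: network_state comes from the role defaults and is empty in this play. Ansible reports only the first false condition as false_condition, so any further guard (such as an initscripts-provider check implied by the task name) would not surface in the trace. A sketch under those assumptions; the sibling task queued above (main.yml:18) is evaluated the same way in the block that follows:

    # Sketch only; msg and the provider condition are inferred from the task
    # name -- only `network_state != {}` appears in the trace.
    - name: >-
        Abort applying the network state configuration if using the
        `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider
      when:
        - network_state != {}
        - network_provider == "initscripts"
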
16380 1727204184.32926: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 16380 1727204184.33570: in run() - task 12b410aa-8751-749c-b6eb-00000000005d 16380 1727204184.33587: variable 'ansible_search_path' from source: unknown 16380 1727204184.33592: variable 'ansible_search_path' from source: unknown 16380 1727204184.33640: calling self._execute() 16380 1727204184.34182: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204184.34194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204184.34615: variable 'omit' from source: magic vars 16380 1727204184.35830: variable 'ansible_distribution_major_version' from source: facts 16380 1727204184.35845: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204184.36106: variable 'network_state' from source: role '' defaults 16380 1727204184.36118: Evaluated conditional (network_state != {}): False 16380 1727204184.36123: when evaluation is False, skipping this task 16380 1727204184.36127: _execute() done 16380 1727204184.36249: dumping result to json 16380 1727204184.36253: done dumping result, returning 16380 1727204184.36264: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-749c-b6eb-00000000005d] 16380 1727204184.36270: sending task result for task 12b410aa-8751-749c-b6eb-00000000005d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204184.36498: no more pending results, returning what we have 16380 1727204184.36503: results queue empty 16380 1727204184.36504: checking for any_errors_fatal 16380 1727204184.36513: done checking for any_errors_fatal 16380 1727204184.36515: checking for max_fail_percentage 16380 1727204184.36517: done checking for max_fail_percentage 16380 1727204184.36518: checking to see if all hosts have failed and the running result is not ok 16380 1727204184.36519: done checking to see if all hosts have failed 16380 1727204184.36521: getting the remaining hosts for this loop 16380 1727204184.36523: done getting the remaining hosts for this loop 16380 1727204184.36527: getting the next task for host managed-node2 16380 1727204184.36535: done getting next task for host managed-node2 16380 1727204184.36539: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16380 1727204184.36542: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204184.36561: getting variables 16380 1727204184.36564: in VariableManager get_vars() 16380 1727204184.36613: Calling all_inventory to load vars for managed-node2 16380 1727204184.36617: Calling groups_inventory to load vars for managed-node2 16380 1727204184.36620: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204184.36636: Calling all_plugins_play to load vars for managed-node2 16380 1727204184.36639: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204184.36643: Calling groups_plugins_play to load vars for managed-node2 16380 1727204184.37210: done sending task result for task 12b410aa-8751-749c-b6eb-00000000005d 16380 1727204184.37214: WORKER PROCESS EXITING 16380 1727204184.40706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204184.45336: done with get_vars() 16380 1727204184.45395: done getting variables 16380 1727204184.45475: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.143) 0:00:45.561 ***** 16380 1727204184.45515: entering _queue_task() for managed-node2/fail 16380 1727204184.45927: worker is 1 (out of 1 available) 16380 1727204184.45942: exiting _queue_task() for managed-node2/fail 16380 1727204184.45957: done queuing things up, now waiting for results queue to drain 16380 1727204184.45959: waiting for pending results... 
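
For the teaming-abort task just queued (main.yml:25), the following block evaluates its guard in two steps: ansible_distribution_major_version | int > 9 is true on this Fedora 39 node, but ansible_distribution in __network_rh_distros is false, so the task is skipped. Both conditions are copied verbatim from the trace; the fail message is an assumption:

    # Sketch only; the `when` entries are verbatim from the trace, msg is assumed.
    - name: >-
        Abort applying teaming configuration if the system version of the
        managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
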
16380 1727204184.46293: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 16380 1727204184.46525: in run() - task 12b410aa-8751-749c-b6eb-00000000005e 16380 1727204184.46529: variable 'ansible_search_path' from source: unknown 16380 1727204184.46532: variable 'ansible_search_path' from source: unknown 16380 1727204184.46535: calling self._execute() 16380 1727204184.46642: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204184.46655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204184.46673: variable 'omit' from source: magic vars 16380 1727204184.47910: variable 'ansible_distribution_major_version' from source: facts 16380 1727204184.47914: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204184.48980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204184.54708: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204184.54803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204184.54852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204184.54907: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204184.54942: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204184.55047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204184.55099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204184.55295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204184.55299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204184.55301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204184.55329: variable 'ansible_distribution_major_version' from source: facts 16380 1727204184.55351: Evaluated conditional (ansible_distribution_major_version | int > 9): True 16380 1727204184.55515: variable 'ansible_distribution' from source: facts 16380 1727204184.55534: variable '__network_rh_distros' from source: role '' defaults 16380 1727204184.55550: Evaluated conditional (ansible_distribution in __network_rh_distros): False 16380 1727204184.55559: when evaluation is False, skipping this task 16380 1727204184.55567: _execute() done 16380 1727204184.55574: dumping result to json 16380 1727204184.55583: done dumping result, returning 16380 1727204184.55599: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-749c-b6eb-00000000005e] 16380 1727204184.55612: sending task result for task 12b410aa-8751-749c-b6eb-00000000005e skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 16380 1727204184.55795: no more pending results, returning what we have 16380 1727204184.55801: results queue empty 16380 1727204184.55803: checking for any_errors_fatal 16380 1727204184.55812: done checking for any_errors_fatal 16380 1727204184.55814: checking for max_fail_percentage 16380 1727204184.55816: done checking for max_fail_percentage 16380 1727204184.55817: checking to see if all hosts have failed and the running result is not ok 16380 1727204184.55818: done checking to see if all hosts have failed 16380 1727204184.55821: getting the remaining hosts for this loop 16380 1727204184.55823: done getting the remaining hosts for this loop 16380 1727204184.55829: getting the next task for host managed-node2 16380 1727204184.55837: done getting next task for host managed-node2 16380 1727204184.55841: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16380 1727204184.55844: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204184.55864: getting variables 16380 1727204184.55867: in VariableManager get_vars() 16380 1727204184.55926: Calling all_inventory to load vars for managed-node2 16380 1727204184.55932: Calling groups_inventory to load vars for managed-node2 16380 1727204184.55936: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204184.55951: Calling all_plugins_play to load vars for managed-node2 16380 1727204184.55954: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204184.55958: Calling groups_plugins_play to load vars for managed-node2 16380 1727204184.56708: done sending task result for task 12b410aa-8751-749c-b6eb-00000000005e 16380 1727204184.56711: WORKER PROCESS EXITING 16380 1727204184.60375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204184.64384: done with get_vars() 16380 1727204184.64429: done getting variables 16380 1727204184.64659: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.191) 0:00:45.753 ***** 16380 1727204184.64737: entering _queue_task() for managed-node2/dnf 16380 1727204184.65498: worker is 1 (out of 1 available) 16380 1727204184.65573: exiting _queue_task() 
for managed-node2/dnf 16380 1727204184.65585: done queuing things up, now waiting for results queue to drain 16380 1727204184.65587: waiting for pending results... 16380 1727204184.65926: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 16380 1727204184.66074: in run() - task 12b410aa-8751-749c-b6eb-00000000005f 16380 1727204184.66130: variable 'ansible_search_path' from source: unknown 16380 1727204184.66154: variable 'ansible_search_path' from source: unknown 16380 1727204184.66233: calling self._execute() 16380 1727204184.66370: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204184.66480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204184.66485: variable 'omit' from source: magic vars 16380 1727204184.67346: variable 'ansible_distribution_major_version' from source: facts 16380 1727204184.67483: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204184.68038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204184.72705: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204184.72801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204184.72856: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204184.72903: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204184.72938: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204184.73047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204184.73111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204184.73151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204184.73217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204184.73422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204184.73809: variable 'ansible_distribution' from source: facts 16380 1727204184.73813: variable 'ansible_distribution_major_version' from source: facts 16380 1727204184.73815: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 16380 1727204184.74447: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204184.74996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204184.75000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204184.75003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204184.75005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204184.75008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204184.75147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204184.75311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204184.75353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204184.75448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204184.75526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204184.75627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204184.75666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204184.75705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204184.75773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204184.75798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204184.76000: variable 'network_connections' from source: play vars 16380 1727204184.76019: variable 'profile' from source: play vars 16380 1727204184.76166: variable 'profile' from source: play vars 16380 1727204184.76177: variable 'interface' from source: set_fact 16380 
1727204184.76269: variable 'interface' from source: set_fact 16380 1727204184.76437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204184.76797: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204184.76945: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204184.76991: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204184.77194: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204184.77223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204184.77261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204184.77375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204184.77417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204184.77664: variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204184.78277: variable 'network_connections' from source: play vars 16380 1727204184.78385: variable 'profile' from source: play vars 16380 1727204184.78476: variable 'profile' from source: play vars 16380 1727204184.78505: variable 'interface' from source: set_fact 16380 1727204184.78637: variable 'interface' from source: set_fact 16380 1727204184.78674: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16380 1727204184.78683: when evaluation is False, skipping this task 16380 1727204184.78695: _execute() done 16380 1727204184.78703: dumping result to json 16380 1727204184.78712: done dumping result, returning 16380 1727204184.78726: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-00000000005f] 16380 1727204184.78738: sending task result for task 12b410aa-8751-749c-b6eb-00000000005f skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16380 1727204184.79053: no more pending results, returning what we have 16380 1727204184.79057: results queue empty 16380 1727204184.79058: checking for any_errors_fatal 16380 1727204184.79066: done checking for any_errors_fatal 16380 1727204184.79067: checking for max_fail_percentage 16380 1727204184.79069: done checking for max_fail_percentage 16380 1727204184.79070: checking to see if all hosts have failed and the running result is not ok 16380 1727204184.79071: done checking to see if all hosts have failed 16380 1727204184.79072: getting the remaining hosts for this loop 16380 1727204184.79074: done getting the remaining hosts for this loop 
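The trace above shows the two gates on the DNF-update check: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7 evaluates True, but __network_wireless_connections_defined or __network_team_connections_defined evaluates False, so the task is skipped on the controller without ever contacting the host. A minimal sketch of the shape such a task could take — only the task name, the dnf action, and the two conditions are taken from this log; the module arguments are illustrative assumptions:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    list: updates  # illustrative argument; the role's actual dnf options are not shown in this log
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined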
16380 1727204184.79079: getting the next task for host managed-node2 16380 1727204184.79088: done getting next task for host managed-node2 16380 1727204184.79094: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16380 1727204184.79097: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204184.79115: getting variables 16380 1727204184.79117: in VariableManager get_vars() 16380 1727204184.79164: Calling all_inventory to load vars for managed-node2 16380 1727204184.79168: Calling groups_inventory to load vars for managed-node2 16380 1727204184.79171: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204184.79184: Calling all_plugins_play to load vars for managed-node2 16380 1727204184.79188: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204184.79702: Calling groups_plugins_play to load vars for managed-node2 16380 1727204184.80398: done sending task result for task 12b410aa-8751-749c-b6eb-00000000005f 16380 1727204184.80402: WORKER PROCESS EXITING 16380 1727204184.82613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204184.88098: done with get_vars() 16380 1727204184.88146: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 16380 1727204184.88241: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.235) 0:00:45.989 ***** 16380 1727204184.88279: entering _queue_task() for managed-node2/yum 16380 1727204184.88647: worker is 1 (out of 1 available) 16380 1727204184.88661: exiting _queue_task() for managed-node2/yum 16380 1727204184.88676: done queuing things up, now waiting for results queue to drain 16380 1727204184.88678: waiting for pending results... 
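Two details are worth noting in the task header just logged. First, the yum action is redirected to ansible.builtin.dnf on this host, so the "YUM package manager" task runs through the dnf action plugin. Second, Ansible evaluates a when list in order and stops at the first condition that is False: as the evaluation below shows, the ansible_distribution_major_version != '6' gate (applied to every task in this block) passes, but ansible_distribution_major_version | int < 8 does not, and the task is skipped. A hedged sketch, with arguments assumed rather than read from the role:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    list: updates  # assumed argument; redirected to the dnf action at runtime
  when: ansible_distribution_major_version | int < 8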
16380 1727204184.88961: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 16380 1727204184.89100: in run() - task 12b410aa-8751-749c-b6eb-000000000060 16380 1727204184.89130: variable 'ansible_search_path' from source: unknown 16380 1727204184.89140: variable 'ansible_search_path' from source: unknown 16380 1727204184.89191: calling self._execute() 16380 1727204184.89412: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204184.89427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204184.89445: variable 'omit' from source: magic vars 16380 1727204184.89910: variable 'ansible_distribution_major_version' from source: facts 16380 1727204184.89932: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204184.90178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204184.93162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204184.93255: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204184.93335: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204184.93359: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204184.93402: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204184.93511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204184.93595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204184.93610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204184.93670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204184.93697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204184.93868: variable 'ansible_distribution_major_version' from source: facts 16380 1727204184.93875: Evaluated conditional (ansible_distribution_major_version | int < 8): False 16380 1727204184.93882: when evaluation is False, skipping this task 16380 1727204184.93885: _execute() done 16380 1727204184.93903: dumping result to json 16380 1727204184.93936: done dumping result, returning 16380 1727204184.93993: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-000000000060] 16380 
1727204184.94006: sending task result for task 12b410aa-8751-749c-b6eb-000000000060 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 16380 1727204184.94316: no more pending results, returning what we have 16380 1727204184.94320: results queue empty 16380 1727204184.94322: checking for any_errors_fatal 16380 1727204184.94361: done checking for any_errors_fatal 16380 1727204184.94362: checking for max_fail_percentage 16380 1727204184.94364: done checking for max_fail_percentage 16380 1727204184.94365: checking to see if all hosts have failed and the running result is not ok 16380 1727204184.94396: done checking to see if all hosts have failed 16380 1727204184.94398: getting the remaining hosts for this loop 16380 1727204184.94407: done getting the remaining hosts for this loop 16380 1727204184.94412: getting the next task for host managed-node2 16380 1727204184.94465: done getting next task for host managed-node2 16380 1727204184.94471: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16380 1727204184.94474: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204184.94503: getting variables 16380 1727204184.94506: in VariableManager get_vars() 16380 1727204184.94559: Calling all_inventory to load vars for managed-node2 16380 1727204184.94564: Calling groups_inventory to load vars for managed-node2 16380 1727204184.94567: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204184.94584: Calling all_plugins_play to load vars for managed-node2 16380 1727204184.94709: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204184.94760: done sending task result for task 12b410aa-8751-749c-b6eb-000000000060 16380 1727204184.94763: WORKER PROCESS EXITING 16380 1727204184.94768: Calling groups_plugins_play to load vars for managed-node2 16380 1727204185.00295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204185.04868: done with get_vars() 16380 1727204185.04937: done getting variables 16380 1727204185.05398: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.171) 0:00:46.160 ***** 16380 1727204185.05436: entering _queue_task() for managed-node2/fail 16380 1727204185.05844: worker is 1 (out of 1 available) 16380 1727204185.05859: exiting _queue_task() for managed-node2/fail 16380 1727204185.05873: done queuing things up, now waiting for results queue to drain 16380 1727204185.05875: waiting for pending results... 
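The skip result for the YUM check is followed immediately by the queueing of the consent task at main.yml:60, which uses the fail action behind the same wireless/team gate as the DNF check. A sketch of the likely shape — the message text below is purely illustrative, not the role's actual wording:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-  # illustrative message; the role's real text is not shown in this log
      NetworkManager must be restarted to apply wireless or team settings;
      set a consent variable to permit this.
  when: __network_wireless_connections_defined or __network_team_connections_defined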
16380 1727204185.06196: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 16380 1727204185.06341: in run() - task 12b410aa-8751-749c-b6eb-000000000061 16380 1727204185.06365: variable 'ansible_search_path' from source: unknown 16380 1727204185.06375: variable 'ansible_search_path' from source: unknown 16380 1727204185.06430: calling self._execute() 16380 1727204185.06556: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204185.06571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204185.06595: variable 'omit' from source: magic vars 16380 1727204185.07215: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.07235: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204185.07417: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204185.07767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204185.11485: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204185.11611: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204185.11673: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204185.11737: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204185.11776: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204185.11937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.12072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.12076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.12103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.12130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.12198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.12235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.12272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.12345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.12366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.12424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.12458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.12493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.12575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.12586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.12824: variable 'network_connections' from source: play vars 16380 1727204185.12902: variable 'profile' from source: play vars 16380 1727204185.12979: variable 'profile' from source: play vars 16380 1727204185.13012: variable 'interface' from source: set_fact 16380 1727204185.13166: variable 'interface' from source: set_fact 16380 1727204185.13251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204185.13596: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204185.13634: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204185.13742: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204185.13865: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204185.14152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204185.14156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204185.14158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.14161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204185.14325: 
variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204185.14984: variable 'network_connections' from source: play vars 16380 1727204185.15053: variable 'profile' from source: play vars 16380 1727204185.15264: variable 'profile' from source: play vars 16380 1727204185.15267: variable 'interface' from source: set_fact 16380 1727204185.15338: variable 'interface' from source: set_fact 16380 1727204185.15514: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16380 1727204185.15527: when evaluation is False, skipping this task 16380 1727204185.15536: _execute() done 16380 1727204185.15545: dumping result to json 16380 1727204185.15555: done dumping result, returning 16380 1727204185.15568: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-000000000061] 16380 1727204185.15590: sending task result for task 12b410aa-8751-749c-b6eb-000000000061 16380 1727204185.15909: done sending task result for task 12b410aa-8751-749c-b6eb-000000000061 16380 1727204185.15913: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16380 1727204185.15978: no more pending results, returning what we have 16380 1727204185.15983: results queue empty 16380 1727204185.15984: checking for any_errors_fatal 16380 1727204185.15995: done checking for any_errors_fatal 16380 1727204185.15996: checking for max_fail_percentage 16380 1727204185.15998: done checking for max_fail_percentage 16380 1727204185.15999: checking to see if all hosts have failed and the running result is not ok 16380 1727204185.16000: done checking to see if all hosts have failed 16380 1727204185.16002: getting the remaining hosts for this loop 16380 1727204185.16004: done getting the remaining hosts for this loop 16380 1727204185.16008: getting the next task for host managed-node2 16380 1727204185.16020: done getting next task for host managed-node2 16380 1727204185.16025: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 16380 1727204185.16027: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204185.16042: getting variables 16380 1727204185.16044: in VariableManager get_vars() 16380 1727204185.16394: Calling all_inventory to load vars for managed-node2 16380 1727204185.16398: Calling groups_inventory to load vars for managed-node2 16380 1727204185.16402: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204185.16413: Calling all_plugins_play to load vars for managed-node2 16380 1727204185.16420: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204185.16425: Calling groups_plugins_play to load vars for managed-node2 16380 1727204185.19043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204185.22916: done with get_vars() 16380 1727204185.22957: done getting variables 16380 1727204185.23031: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.176) 0:00:46.337 ***** 16380 1727204185.23069: entering _queue_task() for managed-node2/package 16380 1727204185.23407: worker is 1 (out of 1 available) 16380 1727204185.23423: exiting _queue_task() for managed-node2/package 16380 1727204185.23436: done queuing things up, now waiting for results queue to drain 16380 1727204185.23438: waiting for pending results... 16380 1727204185.23795: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 16380 1727204185.24429: in run() - task 12b410aa-8751-749c-b6eb-000000000062 16380 1727204185.24434: variable 'ansible_search_path' from source: unknown 16380 1727204185.24437: variable 'ansible_search_path' from source: unknown 16380 1727204185.24440: calling self._execute() 16380 1727204185.24687: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204185.24766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204185.24785: variable 'omit' from source: magic vars 16380 1727204185.25697: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.25826: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204185.26215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204185.27022: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204185.27080: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204185.27158: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204185.27465: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204185.27736: variable 'network_packages' from source: role '' defaults 16380 1727204185.27996: variable '__network_provider_setup' from source: role '' defaults 16380 1727204185.28213: variable '__network_service_name_default_nm' from source: role '' defaults 16380 1727204185.28300: variable 
'__network_service_name_default_nm' from source: role '' defaults 16380 1727204185.28320: variable '__network_packages_default_nm' from source: role '' defaults 16380 1727204185.28403: variable '__network_packages_default_nm' from source: role '' defaults 16380 1727204185.28986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204185.33958: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204185.34103: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204185.34345: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204185.34395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204185.34435: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204185.34544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.34711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.34815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.34876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.35021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.35083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.35139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.35234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.35375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.35399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.36034: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16380 1727204185.36378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.36416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.36649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.36652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.36655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.37083: variable 'ansible_python' from source: facts 16380 1727204185.37086: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16380 1727204185.37241: variable '__network_wpa_supplicant_required' from source: role '' defaults 16380 1727204185.37466: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16380 1727204185.37862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.37894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.37986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.38048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.38281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.38284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.38295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.38413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.38467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.38797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.39025: variable 'network_connections' from source: play vars 16380 1727204185.39038: variable 'profile' from source: play vars 16380 1727204185.39170: variable 'profile' from source: play vars 16380 1727204185.39245: variable 'interface' from source: set_fact 16380 1727204185.39331: variable 'interface' from source: set_fact 16380 1727204185.39546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204185.39709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204185.39755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.39837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204185.39947: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204185.40811: variable 'network_connections' from source: play vars 16380 1727204185.40877: variable 'profile' from source: play vars 16380 1727204185.41121: variable 'profile' from source: play vars 16380 1727204185.41298: variable 'interface' from source: set_fact 16380 1727204185.41302: variable 'interface' from source: set_fact 16380 1727204185.41460: variable '__network_packages_default_wireless' from source: role '' defaults 16380 1727204185.41897: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204185.42481: variable 'network_connections' from source: play vars 16380 1727204185.42563: variable 'profile' from source: play vars 16380 1727204185.42722: variable 'profile' from source: play vars 16380 1727204185.42733: variable 'interface' from source: set_fact 16380 1727204185.42972: variable 'interface' from source: set_fact 16380 1727204185.43133: variable '__network_packages_default_team' from source: role '' defaults 16380 1727204185.43354: variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204185.44069: variable 'network_connections' from source: play vars 16380 1727204185.44204: variable 'profile' from source: play vars 16380 1727204185.44395: variable 'profile' from source: play vars 16380 1727204185.44398: variable 'interface' from source: set_fact 16380 1727204185.44655: variable 'interface' from source: set_fact 16380 1727204185.44803: variable '__network_service_name_default_initscripts' from source: role '' defaults 16380 1727204185.45000: variable '__network_service_name_default_initscripts' from source: role '' defaults 16380 1727204185.45014: variable '__network_packages_default_initscripts' from source: role '' defaults 16380 1727204185.45206: variable '__network_packages_default_initscripts' from source: role '' defaults 16380 1727204185.45846: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16380 1727204185.47205: variable 'network_connections' from source: play vars 16380 1727204185.47221: variable 'profile' from source: play vars 16380 
1727204185.47554: variable 'profile' from source: play vars 16380 1727204185.47558: variable 'interface' from source: set_fact 16380 1727204185.47560: variable 'interface' from source: set_fact 16380 1727204185.47562: variable 'ansible_distribution' from source: facts 16380 1727204185.47565: variable '__network_rh_distros' from source: role '' defaults 16380 1727204185.47603: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.47684: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16380 1727204185.48142: variable 'ansible_distribution' from source: facts 16380 1727204185.48152: variable '__network_rh_distros' from source: role '' defaults 16380 1727204185.48216: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.48234: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16380 1727204185.48664: variable 'ansible_distribution' from source: facts 16380 1727204185.48715: variable '__network_rh_distros' from source: role '' defaults 16380 1727204185.48732: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.48927: variable 'network_provider' from source: set_fact 16380 1727204185.48931: variable 'ansible_facts' from source: unknown 16380 1727204185.51381: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 16380 1727204185.51385: when evaluation is False, skipping this task 16380 1727204185.51388: _execute() done 16380 1727204185.51392: dumping result to json 16380 1727204185.51398: done dumping result, returning 16380 1727204185.51408: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-749c-b6eb-000000000062] 16380 1727204185.51415: sending task result for task 12b410aa-8751-749c-b6eb-000000000062 16380 1727204185.51701: done sending task result for task 12b410aa-8751-749c-b6eb-000000000062 16380 1727204185.51704: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 16380 1727204185.51755: no more pending results, returning what we have 16380 1727204185.51759: results queue empty 16380 1727204185.51760: checking for any_errors_fatal 16380 1727204185.51767: done checking for any_errors_fatal 16380 1727204185.51768: checking for max_fail_percentage 16380 1727204185.51769: done checking for max_fail_percentage 16380 1727204185.51770: checking to see if all hosts have failed and the running result is not ok 16380 1727204185.51771: done checking to see if all hosts have failed 16380 1727204185.51772: getting the remaining hosts for this loop 16380 1727204185.51774: done getting the remaining hosts for this loop 16380 1727204185.51777: getting the next task for host managed-node2 16380 1727204185.51783: done getting next task for host managed-node2 16380 1727204185.51787: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16380 1727204185.51791: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204185.51805: getting variables 16380 1727204185.51806: in VariableManager get_vars() 16380 1727204185.51847: Calling all_inventory to load vars for managed-node2 16380 1727204185.51850: Calling groups_inventory to load vars for managed-node2 16380 1727204185.51852: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204185.51867: Calling all_plugins_play to load vars for managed-node2 16380 1727204185.51870: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204185.51874: Calling groups_plugins_play to load vars for managed-node2 16380 1727204185.55115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204185.58207: done with get_vars() 16380 1727204185.58253: done getting variables 16380 1727204185.58330: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.352) 0:00:46.690 ***** 16380 1727204185.58368: entering _queue_task() for managed-node2/package 16380 1727204185.58742: worker is 1 (out of 1 available) 16380 1727204185.58757: exiting _queue_task() for managed-node2/package 16380 1727204185.58771: done queuing things up, now waiting for results queue to drain 16380 1727204185.58773: waiting for pending results... 
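The "Install packages" task above was skipped by the subset test rather than by the wireless/team gate: not network_packages is subset(ansible_facts.packages.keys()) evaluates False, meaning every package the role computed into network_packages already appears in the package facts gathered earlier in the run. A sketch under that reading — the condition is verbatim from the log, the package arguments are assumptions:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"  # assumption: the role installs its computed package list
    state: present                  # assumption
  when: not network_packages is subset(ansible_facts.packages.keys())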
16380 1727204185.59084: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 16380 1727204185.59231: in run() - task 12b410aa-8751-749c-b6eb-000000000063 16380 1727204185.59254: variable 'ansible_search_path' from source: unknown 16380 1727204185.59262: variable 'ansible_search_path' from source: unknown 16380 1727204185.59310: calling self._execute() 16380 1727204185.59595: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204185.59600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204185.59603: variable 'omit' from source: magic vars 16380 1727204185.59946: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.59968: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204185.60145: variable 'network_state' from source: role '' defaults 16380 1727204185.60168: Evaluated conditional (network_state != {}): False 16380 1727204185.60177: when evaluation is False, skipping this task 16380 1727204185.60186: _execute() done 16380 1727204185.60196: dumping result to json 16380 1727204185.60206: done dumping result, returning 16380 1727204185.60222: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-749c-b6eb-000000000063] 16380 1727204185.60235: sending task result for task 12b410aa-8751-749c-b6eb-000000000063 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204185.60424: no more pending results, returning what we have 16380 1727204185.60429: results queue empty 16380 1727204185.60430: checking for any_errors_fatal 16380 1727204185.60437: done checking for any_errors_fatal 16380 1727204185.60438: checking for max_fail_percentage 16380 1727204185.60440: done checking for max_fail_percentage 16380 1727204185.60441: checking to see if all hosts have failed and the running result is not ok 16380 1727204185.60442: done checking to see if all hosts have failed 16380 1727204185.60444: getting the remaining hosts for this loop 16380 1727204185.60446: done getting the remaining hosts for this loop 16380 1727204185.60450: getting the next task for host managed-node2 16380 1727204185.60458: done getting next task for host managed-node2 16380 1727204185.60463: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16380 1727204185.60466: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204185.60483: getting variables 16380 1727204185.60485: in VariableManager get_vars() 16380 1727204185.60536: Calling all_inventory to load vars for managed-node2 16380 1727204185.60540: Calling groups_inventory to load vars for managed-node2 16380 1727204185.60543: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204185.60559: Calling all_plugins_play to load vars for managed-node2 16380 1727204185.60563: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204185.60567: Calling groups_plugins_play to load vars for managed-node2 16380 1727204185.61407: done sending task result for task 12b410aa-8751-749c-b6eb-000000000063 16380 1727204185.61411: WORKER PROCESS EXITING 16380 1727204185.63036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204185.66847: done with get_vars() 16380 1727204185.66898: done getting variables 16380 1727204185.66976: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.086) 0:00:46.776 ***** 16380 1727204185.67020: entering _queue_task() for managed-node2/package 16380 1727204185.67397: worker is 1 (out of 1 available) 16380 1727204185.67412: exiting _queue_task() for managed-node2/package 16380 1727204185.67428: done queuing things up, now waiting for results queue to drain 16380 1727204185.67431: waiting for pending results... 
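Both install tasks, at main.yml:85 and main.yml:96, hang off the same gate, network_state != {}. Since network_state keeps its empty role default in this run, the NetworkManager/nmstate install above was skipped, and the python3-libnmstate task just queued is about to be skipped for the same reason. A sketch of the second one — the package name comes from the task name itself; everything else is assumed:

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate  # from the task name; state is an assumption
    state: present
  when: network_state != {}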
16380 1727204185.67723: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 16380 1727204185.67864: in run() - task 12b410aa-8751-749c-b6eb-000000000064 16380 1727204185.67892: variable 'ansible_search_path' from source: unknown 16380 1727204185.67936: variable 'ansible_search_path' from source: unknown 16380 1727204185.68054: calling self._execute() 16380 1727204185.68293: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204185.68495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204185.68499: variable 'omit' from source: magic vars 16380 1727204185.69378: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.69402: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204185.69651: variable 'network_state' from source: role '' defaults 16380 1727204185.69795: Evaluated conditional (network_state != {}): False 16380 1727204185.69805: when evaluation is False, skipping this task 16380 1727204185.69814: _execute() done 16380 1727204185.69825: dumping result to json 16380 1727204185.69834: done dumping result, returning 16380 1727204185.69847: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-749c-b6eb-000000000064] 16380 1727204185.69858: sending task result for task 12b410aa-8751-749c-b6eb-000000000064 16380 1727204185.70211: done sending task result for task 12b410aa-8751-749c-b6eb-000000000064 16380 1727204185.70215: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204185.70270: no more pending results, returning what we have 16380 1727204185.70274: results queue empty 16380 1727204185.70276: checking for any_errors_fatal 16380 1727204185.70284: done checking for any_errors_fatal 16380 1727204185.70285: checking for max_fail_percentage 16380 1727204185.70287: done checking for max_fail_percentage 16380 1727204185.70288: checking to see if all hosts have failed and the running result is not ok 16380 1727204185.70291: done checking to see if all hosts have failed 16380 1727204185.70292: getting the remaining hosts for this loop 16380 1727204185.70295: done getting the remaining hosts for this loop 16380 1727204185.70299: getting the next task for host managed-node2 16380 1727204185.70306: done getting next task for host managed-node2 16380 1727204185.70311: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16380 1727204185.70313: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204185.70333: getting variables 16380 1727204185.70335: in VariableManager get_vars() 16380 1727204185.70381: Calling all_inventory to load vars for managed-node2 16380 1727204185.70385: Calling groups_inventory to load vars for managed-node2 16380 1727204185.70409: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204185.70427: Calling all_plugins_play to load vars for managed-node2 16380 1727204185.70450: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204185.70460: Calling groups_plugins_play to load vars for managed-node2 16380 1727204185.73879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204185.77614: done with get_vars() 16380 1727204185.77659: done getting variables 16380 1727204185.77738: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.107) 0:00:46.884 ***** 16380 1727204185.77776: entering _queue_task() for managed-node2/service 16380 1727204185.78161: worker is 1 (out of 1 available) 16380 1727204185.78174: exiting _queue_task() for managed-node2/service 16380 1727204185.78187: done queuing things up, now waiting for results queue to drain 16380 1727204185.78190: waiting for pending results... 
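The restart task at main.yml:109 is the service-action counterpart of the earlier consent prompt, behind the same wireless/team gate. A sketch — only the task name, the service action, and the condition are confirmed by the log; the service arguments are assumptions:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager  # assumption; the role may well use its network_service_name variable here
    state: restarted      # assumption
  when: __network_wireless_connections_defined or __network_team_connections_defined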
16380 1727204185.78507: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 16380 1727204185.78623: in run() - task 12b410aa-8751-749c-b6eb-000000000065 16380 1727204185.78797: variable 'ansible_search_path' from source: unknown 16380 1727204185.78802: variable 'ansible_search_path' from source: unknown 16380 1727204185.78805: calling self._execute() 16380 1727204185.78814: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204185.78823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204185.78835: variable 'omit' from source: magic vars 16380 1727204185.79324: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.79337: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204185.79512: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204185.79789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204185.82599: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204185.82664: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204185.82693: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204185.82726: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204185.82753: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204185.82837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.82880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.82901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.82936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.82949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.82997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.83016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.83039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 16380 1727204185.83071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.83087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.83126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.83146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.83167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.83203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.83218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.83366: variable 'network_connections' from source: play vars 16380 1727204185.83377: variable 'profile' from source: play vars 16380 1727204185.83443: variable 'profile' from source: play vars 16380 1727204185.83446: variable 'interface' from source: set_fact 16380 1727204185.83499: variable 'interface' from source: set_fact 16380 1727204185.83568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204185.83704: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204185.83741: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204185.83770: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204185.83797: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204185.83837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204185.83860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204185.83883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.83906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204185.83952: variable '__network_team_connections_defined' from source: role '' defaults 16380 
1727204185.84158: variable 'network_connections' from source: play vars 16380 1727204185.84162: variable 'profile' from source: play vars 16380 1727204185.84218: variable 'profile' from source: play vars 16380 1727204185.84226: variable 'interface' from source: set_fact 16380 1727204185.84275: variable 'interface' from source: set_fact 16380 1727204185.84302: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 16380 1727204185.84305: when evaluation is False, skipping this task 16380 1727204185.84308: _execute() done 16380 1727204185.84311: dumping result to json 16380 1727204185.84314: done dumping result, returning 16380 1727204185.84328: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-749c-b6eb-000000000065] 16380 1727204185.84337: sending task result for task 12b410aa-8751-749c-b6eb-000000000065 16380 1727204185.84429: done sending task result for task 12b410aa-8751-749c-b6eb-000000000065 16380 1727204185.84432: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 16380 1727204185.84481: no more pending results, returning what we have 16380 1727204185.84485: results queue empty 16380 1727204185.84486: checking for any_errors_fatal 16380 1727204185.84496: done checking for any_errors_fatal 16380 1727204185.84497: checking for max_fail_percentage 16380 1727204185.84499: done checking for max_fail_percentage 16380 1727204185.84500: checking to see if all hosts have failed and the running result is not ok 16380 1727204185.84501: done checking to see if all hosts have failed 16380 1727204185.84502: getting the remaining hosts for this loop 16380 1727204185.84503: done getting the remaining hosts for this loop 16380 1727204185.84507: getting the next task for host managed-node2 16380 1727204185.84515: done getting next task for host managed-node2 16380 1727204185.84519: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16380 1727204185.84522: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204185.84537: getting variables 16380 1727204185.84539: in VariableManager get_vars() 16380 1727204185.84581: Calling all_inventory to load vars for managed-node2 16380 1727204185.84585: Calling groups_inventory to load vars for managed-node2 16380 1727204185.84587: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204185.84608: Calling all_plugins_play to load vars for managed-node2 16380 1727204185.84611: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204185.84614: Calling groups_plugins_play to load vars for managed-node2 16380 1727204185.86714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204185.88640: done with get_vars() 16380 1727204185.88668: done getting variables 16380 1727204185.88723: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.109) 0:00:46.994 ***** 16380 1727204185.88749: entering _queue_task() for managed-node2/service 16380 1727204185.89013: worker is 1 (out of 1 available) 16380 1727204185.89032: exiting _queue_task() for managed-node2/service 16380 1727204185.89044: done queuing things up, now waiting for results queue to drain 16380 1727204185.89047: waiting for pending results... 16380 1727204185.89234: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 16380 1727204185.89314: in run() - task 12b410aa-8751-749c-b6eb-000000000066 16380 1727204185.89331: variable 'ansible_search_path' from source: unknown 16380 1727204185.89335: variable 'ansible_search_path' from source: unknown 16380 1727204185.89366: calling self._execute() 16380 1727204185.89456: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204185.89463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204185.89473: variable 'omit' from source: magic vars 16380 1727204185.89801: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.89814: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204185.89964: variable 'network_provider' from source: set_fact 16380 1727204185.89968: variable 'network_state' from source: role '' defaults 16380 1727204185.89979: Evaluated conditional (network_provider == "nm" or network_state != {}): True 16380 1727204185.89986: variable 'omit' from source: magic vars 16380 1727204185.90024: variable 'omit' from source: magic vars 16380 1727204185.90053: variable 'network_service_name' from source: role '' defaults 16380 1727204185.90110: variable 'network_service_name' from source: role '' defaults 16380 1727204185.90203: variable '__network_provider_setup' from source: role '' defaults 16380 1727204185.90209: variable '__network_service_name_default_nm' from source: role '' defaults 16380 1727204185.90284: variable '__network_service_name_default_nm' from source: role '' defaults 16380 1727204185.90288: variable '__network_packages_default_nm' from source: role '' defaults 
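
[editor's note] For orientation while reading the trace: the task being queued here is the role's "Enable and start NetworkManager" step (task path roles/network/tasks/main.yml:122, action plugin 'service'), guarded by the two conditionals the executor just evaluated as True. A minimal YAML sketch of what such a task looks like, reconstructed only from the conditionals and variable names visible in this log — the role's actual YAML may differ:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolved above from __network_service_name_default_nm
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm" or network_state != {}

Note that the 'service' action plugin delegates to the systemd module on this host, which is why the trace below ships AnsiballZ_systemd.py to the target and reports "done with _execute_module (ansible.legacy.systemd, ...)".
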
16380 1727204185.90406: variable '__network_packages_default_nm' from source: role '' defaults 16380 1727204185.90824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204185.92665: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204185.92725: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204185.92756: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204185.92801: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204185.92826: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204185.92894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.92924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.92945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.92977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.92991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.93037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.93056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.93076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.93110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.93129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.93322: variable '__network_packages_default_gobject_packages' from source: role '' defaults 16380 1727204185.93421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.93441: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.93466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.93497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.93510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.93588: variable 'ansible_python' from source: facts 16380 1727204185.93610: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 16380 1727204185.93680: variable '__network_wpa_supplicant_required' from source: role '' defaults 16380 1727204185.93747: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16380 1727204185.93856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.93881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.93905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.93938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.93951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.93994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204185.94023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204185.94041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.94071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204185.94083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204185.94201: variable 'network_connections' from 
source: play vars 16380 1727204185.94209: variable 'profile' from source: play vars 16380 1727204185.94272: variable 'profile' from source: play vars 16380 1727204185.94278: variable 'interface' from source: set_fact 16380 1727204185.94332: variable 'interface' from source: set_fact 16380 1727204185.94422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204185.94569: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204185.94610: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204185.94649: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204185.94698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204185.94766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204185.94796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204185.94860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204185.94864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204185.94927: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204185.95160: variable 'network_connections' from source: play vars 16380 1727204185.95167: variable 'profile' from source: play vars 16380 1727204185.95256: variable 'profile' from source: play vars 16380 1727204185.95262: variable 'interface' from source: set_fact 16380 1727204185.95394: variable 'interface' from source: set_fact 16380 1727204185.95397: variable '__network_packages_default_wireless' from source: role '' defaults 16380 1727204185.95482: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204185.95819: variable 'network_connections' from source: play vars 16380 1727204185.95832: variable 'profile' from source: play vars 16380 1727204185.95991: variable 'profile' from source: play vars 16380 1727204185.96025: variable 'interface' from source: set_fact 16380 1727204185.96048: variable 'interface' from source: set_fact 16380 1727204185.96070: variable '__network_packages_default_team' from source: role '' defaults 16380 1727204185.96165: variable '__network_team_connections_defined' from source: role '' defaults 16380 1727204185.96475: variable 'network_connections' from source: play vars 16380 1727204185.96479: variable 'profile' from source: play vars 16380 1727204185.96538: variable 'profile' from source: play vars 16380 1727204185.96542: variable 'interface' from source: set_fact 16380 1727204185.96607: variable 'interface' from source: set_fact 16380 1727204185.96653: variable '__network_service_name_default_initscripts' from source: role '' defaults 16380 1727204185.96707: variable '__network_service_name_default_initscripts' from source: role '' defaults 16380 1727204185.96742: 
variable '__network_packages_default_initscripts' from source: role '' defaults 16380 1727204185.96790: variable '__network_packages_default_initscripts' from source: role '' defaults 16380 1727204185.97061: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 16380 1727204185.97582: variable 'network_connections' from source: play vars 16380 1727204185.97585: variable 'profile' from source: play vars 16380 1727204185.97637: variable 'profile' from source: play vars 16380 1727204185.97641: variable 'interface' from source: set_fact 16380 1727204185.97712: variable 'interface' from source: set_fact 16380 1727204185.97744: variable 'ansible_distribution' from source: facts 16380 1727204185.97756: variable '__network_rh_distros' from source: role '' defaults 16380 1727204185.97759: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.97761: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 16380 1727204185.98003: variable 'ansible_distribution' from source: facts 16380 1727204185.98031: variable '__network_rh_distros' from source: role '' defaults 16380 1727204185.98037: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.98042: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 16380 1727204185.98216: variable 'ansible_distribution' from source: facts 16380 1727204185.98223: variable '__network_rh_distros' from source: role '' defaults 16380 1727204185.98226: variable 'ansible_distribution_major_version' from source: facts 16380 1727204185.98256: variable 'network_provider' from source: set_fact 16380 1727204185.98277: variable 'omit' from source: magic vars 16380 1727204185.98306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204185.98331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204185.98346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204185.98364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204185.98374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204185.98435: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204185.98442: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204185.98445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204185.98571: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204185.98626: Set connection var ansible_shell_executable to /bin/sh 16380 1727204185.98663: Set connection var ansible_connection to ssh 16380 1727204185.98667: Set connection var ansible_shell_type to sh 16380 1727204185.98670: Set connection var ansible_pipelining to False 16380 1727204185.98672: Set connection var ansible_timeout to 10 16380 1727204185.98674: variable 'ansible_shell_executable' from source: unknown 16380 1727204185.98682: variable 'ansible_connection' from source: unknown 16380 1727204185.98685: variable 'ansible_module_compression' from source: unknown 16380 1727204185.98687: variable 'ansible_shell_type' from source: unknown 16380 1727204185.98692: variable 'ansible_shell_executable' from 
source: unknown 16380 1727204185.98747: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204185.98756: variable 'ansible_pipelining' from source: unknown 16380 1727204185.98759: variable 'ansible_timeout' from source: unknown 16380 1727204185.98761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204185.98924: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204185.98928: variable 'omit' from source: magic vars 16380 1727204185.98931: starting attempt loop 16380 1727204185.98933: running the handler 16380 1727204185.98988: variable 'ansible_facts' from source: unknown 16380 1727204186.00075: _low_level_execute_command(): starting 16380 1727204186.00079: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204186.00785: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204186.00791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.00794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204186.00797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.00880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204186.00885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204186.00954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204186.02901: stdout chunk (state=3): >>>/root <<< 16380 1727204186.03001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204186.03017: stderr chunk (state=3): >>><<< 16380 1727204186.03033: stdout chunk (state=3): >>><<< 16380 1727204186.03066: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204186.03088: _low_level_execute_command(): starting 16380 1727204186.03110: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857 `" && echo ansible-tmp-1727204186.0307405-19423-179744323415857="` echo /root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857 `" ) && sleep 0' 16380 1727204186.03784: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204186.03994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204186.04104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204186.04171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204186.06181: stdout chunk (state=3): >>>ansible-tmp-1727204186.0307405-19423-179744323415857=/root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857 <<< 16380 1727204186.06303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204186.06355: stderr chunk (state=3): >>><<< 16380 1727204186.06358: stdout chunk (state=3): >>><<< 16380 1727204186.06374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204186.0307405-19423-179744323415857=/root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204186.06405: variable 'ansible_module_compression' from source: unknown 16380 1727204186.06457: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 16380 1727204186.06512: variable 'ansible_facts' from source: unknown 16380 1727204186.06658: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/AnsiballZ_systemd.py 16380 1727204186.06783: Sending initial data 16380 1727204186.06787: Sent initial data (156 bytes) 16380 1727204186.07251: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204186.07260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204186.07262: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.07265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204186.07267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204186.07269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.07321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204186.07327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204186.07366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204186.08998: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204186.09049: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204186.09095: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpvzq2gjkl /root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/AnsiballZ_systemd.py <<< 16380 1727204186.09097: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/AnsiballZ_systemd.py" <<< 16380 1727204186.09156: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpvzq2gjkl" to remote "/root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/AnsiballZ_systemd.py" <<< 16380 1727204186.11206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204186.11276: stderr chunk (state=3): >>><<< 16380 1727204186.11285: stdout chunk (state=3): >>><<< 16380 1727204186.11323: done transferring module to remote 16380 1727204186.11327: _low_level_execute_command(): starting 16380 1727204186.11333: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/ /root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/AnsiballZ_systemd.py && sleep 0' 16380 1727204186.11932: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204186.11935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.11940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204186.11943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.12041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204186.12071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204186.13994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204186.14048: stderr chunk (state=3): >>><<< 16380 1727204186.14052: stdout chunk (state=3): >>><<< 16380 1727204186.14066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204186.14069: _low_level_execute_command(): starting 16380 1727204186.14075: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/AnsiballZ_systemd.py && sleep 0' 16380 1727204186.14534: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204186.14538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204186.14540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204186.14543: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204186.14545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.14595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204186.14614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204186.14653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204186.48173: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": 
"stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4390912", "MemoryAvailable": "infinity", "CPUUsageNSec": "1196499000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 16380 1727204186.48208: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", 
"LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": 
"\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 16380 1727204186.48225: stdout chunk (state=3): >>>"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 16380 1727204186.50327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204186.50395: stderr chunk (state=3): >>><<< 16380 1727204186.50398: stdout chunk (state=3): >>><<< 16380 1727204186.50416: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4390912", "MemoryAvailable": "infinity", "CPUUsageNSec": "1196499000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204186.50591: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204186.50610: _low_level_execute_command(): starting 16380 1727204186.50615: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204186.0307405-19423-179744323415857/ > /dev/null 2>&1 && sleep 0' 16380 1727204186.51105: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204186.51110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204186.51113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204186.51115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204186.51121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.51177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204186.51183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204186.51185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 
1727204186.51224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204186.53204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204186.53256: stderr chunk (state=3): >>><<< 16380 1727204186.53259: stdout chunk (state=3): >>><<< 16380 1727204186.53273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204186.53284: handler run complete 16380 1727204186.53337: attempt loop complete, returning result 16380 1727204186.53341: _execute() done 16380 1727204186.53344: dumping result to json 16380 1727204186.53358: done dumping result, returning 16380 1727204186.53368: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-749c-b6eb-000000000066] 16380 1727204186.53373: sending task result for task 12b410aa-8751-749c-b6eb-000000000066 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 1727204186.53667: no more pending results, returning what we have 16380 1727204186.53671: results queue empty 16380 1727204186.53673: checking for any_errors_fatal 16380 1727204186.53680: done checking for any_errors_fatal 16380 1727204186.53681: checking for max_fail_percentage 16380 1727204186.53683: done checking for max_fail_percentage 16380 1727204186.53684: checking to see if all hosts have failed and the running result is not ok 16380 1727204186.53685: done checking to see if all hosts have failed 16380 1727204186.53686: getting the remaining hosts for this loop 16380 1727204186.53688: done getting the remaining hosts for this loop 16380 1727204186.53697: getting the next task for host managed-node2 16380 1727204186.53713: done getting next task for host managed-node2 16380 1727204186.53720: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16380 1727204186.53722: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204186.53732: done sending task result for task 12b410aa-8751-749c-b6eb-000000000066 16380 1727204186.53734: WORKER PROCESS EXITING 16380 1727204186.53743: getting variables 16380 1727204186.53745: in VariableManager get_vars() 16380 1727204186.53785: Calling all_inventory to load vars for managed-node2 16380 1727204186.53788: Calling groups_inventory to load vars for managed-node2 16380 1727204186.53794: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204186.53805: Calling all_plugins_play to load vars for managed-node2 16380 1727204186.53808: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204186.53824: Calling groups_plugins_play to load vars for managed-node2 16380 1727204186.55074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204186.56652: done with get_vars() 16380 1727204186.56675: done getting variables 16380 1727204186.56727: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.680) 0:00:47.674 ***** 16380 1727204186.56756: entering _queue_task() for managed-node2/service 16380 1727204186.57004: worker is 1 (out of 1 available) 16380 1727204186.57018: exiting _queue_task() for managed-node2/service 16380 1727204186.57030: done queuing things up, now waiting for results queue to drain 16380 1727204186.57032: waiting for pending results... 
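For reference, the "Enable and start NetworkManager" task that completed just above maps onto a plain systemd module call: the module_args dumped in the log show name=NetworkManager, state=started, enabled=true under scope=system. A minimal standalone sketch of that invocation, assuming the stock ansible.builtin.systemd module (this is a reconstruction from the logged arguments, not the role's actual task file):

    # Reconstruction of the ansible.legacy.systemd call logged above.
    # Only name/state/enabled come from the log; everything else is
    # left at module defaults.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true  # why the result above is rendered as "censored"

The no_log flag matches the '_ansible_no_log': True visible in the module invocation, which is why the play output replaces the full result with the "censored" placeholder.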
16380 1727204186.57224: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 16380 1727204186.57315: in run() - task 12b410aa-8751-749c-b6eb-000000000067 16380 1727204186.57334: variable 'ansible_search_path' from source: unknown 16380 1727204186.57338: variable 'ansible_search_path' from source: unknown 16380 1727204186.57371: calling self._execute() 16380 1727204186.57463: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204186.57467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204186.57484: variable 'omit' from source: magic vars 16380 1727204186.57810: variable 'ansible_distribution_major_version' from source: facts 16380 1727204186.57824: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204186.57929: variable 'network_provider' from source: set_fact 16380 1727204186.57935: Evaluated conditional (network_provider == "nm"): True 16380 1727204186.58013: variable '__network_wpa_supplicant_required' from source: role '' defaults 16380 1727204186.58094: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 16380 1727204186.58250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204186.60144: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204186.60208: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204186.60234: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204186.60265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204186.60288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204186.60363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204186.60386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204186.60410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204186.60449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204186.60462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204186.60504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204186.60525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 16380 1727204186.60551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204186.60583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204186.60597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204186.60633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 16380 1727204186.60668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204186.60679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204186.60712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204186.60726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204186.60846: variable 'network_connections' from source: play vars 16380 1727204186.60873: variable 'profile' from source: play vars 16380 1727204186.60920: variable 'profile' from source: play vars 16380 1727204186.60924: variable 'interface' from source: set_fact 16380 1727204186.60986: variable 'interface' from source: set_fact 16380 1727204186.61049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 16380 1727204186.61182: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 16380 1727204186.61222: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 16380 1727204186.61246: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 16380 1727204186.61271: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 16380 1727204186.61314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 16380 1727204186.61335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 16380 1727204186.61356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204186.61377: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 16380 1727204186.61433: variable '__network_wireless_connections_defined' from source: role '' defaults 16380 1727204186.61626: variable 'network_connections' from source: play vars 16380 1727204186.61632: variable 'profile' from source: play vars 16380 1727204186.61684: variable 'profile' from source: play vars 16380 1727204186.61688: variable 'interface' from source: set_fact 16380 1727204186.61740: variable 'interface' from source: set_fact 16380 1727204186.61770: Evaluated conditional (__network_wpa_supplicant_required): False 16380 1727204186.61773: when evaluation is False, skipping this task 16380 1727204186.61776: _execute() done 16380 1727204186.61787: dumping result to json 16380 1727204186.61793: done dumping result, returning 16380 1727204186.61796: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-749c-b6eb-000000000067] 16380 1727204186.61843: sending task result for task 12b410aa-8751-749c-b6eb-000000000067 16380 1727204186.61915: done sending task result for task 12b410aa-8751-749c-b6eb-000000000067 16380 1727204186.61920: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 16380 1727204186.61998: no more pending results, returning what we have 16380 1727204186.62002: results queue empty 16380 1727204186.62003: checking for any_errors_fatal 16380 1727204186.62026: done checking for any_errors_fatal 16380 1727204186.62027: checking for max_fail_percentage 16380 1727204186.62031: done checking for max_fail_percentage 16380 1727204186.62032: checking to see if all hosts have failed and the running result is not ok 16380 1727204186.62033: done checking to see if all hosts have failed 16380 1727204186.62034: getting the remaining hosts for this loop 16380 1727204186.62035: done getting the remaining hosts for this loop 16380 1727204186.62039: getting the next task for host managed-node2 16380 1727204186.62046: done getting next task for host managed-node2 16380 1727204186.62050: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 16380 1727204186.62052: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204186.62066: getting variables 16380 1727204186.62068: in VariableManager get_vars() 16380 1727204186.62107: Calling all_inventory to load vars for managed-node2 16380 1727204186.62111: Calling groups_inventory to load vars for managed-node2 16380 1727204186.62113: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204186.62126: Calling all_plugins_play to load vars for managed-node2 16380 1727204186.62129: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204186.62133: Calling groups_plugins_play to load vars for managed-node2 16380 1727204186.63465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204186.65043: done with get_vars() 16380 1727204186.65064: done getting variables 16380 1727204186.65120: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.083) 0:00:47.758 ***** 16380 1727204186.65144: entering _queue_task() for managed-node2/service 16380 1727204186.65406: worker is 1 (out of 1 available) 16380 1727204186.65420: exiting _queue_task() for managed-node2/service 16380 1727204186.65432: done queuing things up, now waiting for results queue to drain 16380 1727204186.65434: waiting for pending results... 16380 1727204186.65636: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 16380 1727204186.65730: in run() - task 12b410aa-8751-749c-b6eb-000000000068 16380 1727204186.65744: variable 'ansible_search_path' from source: unknown 16380 1727204186.65748: variable 'ansible_search_path' from source: unknown 16380 1727204186.65783: calling self._execute() 16380 1727204186.65879: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204186.65893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204186.65995: variable 'omit' from source: magic vars 16380 1727204186.66236: variable 'ansible_distribution_major_version' from source: facts 16380 1727204186.66247: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204186.66353: variable 'network_provider' from source: set_fact 16380 1727204186.66359: Evaluated conditional (network_provider == "initscripts"): False 16380 1727204186.66363: when evaluation is False, skipping this task 16380 1727204186.66366: _execute() done 16380 1727204186.66371: dumping result to json 16380 1727204186.66375: done dumping result, returning 16380 1727204186.66383: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-749c-b6eb-000000000068] 16380 1727204186.66390: sending task result for task 12b410aa-8751-749c-b6eb-000000000068 16380 1727204186.66479: done sending task result for task 12b410aa-8751-749c-b6eb-000000000068 16380 1727204186.66483: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 16380 
1727204186.66530: no more pending results, returning what we have 16380 1727204186.66534: results queue empty 16380 1727204186.66535: checking for any_errors_fatal 16380 1727204186.66546: done checking for any_errors_fatal 16380 1727204186.66547: checking for max_fail_percentage 16380 1727204186.66549: done checking for max_fail_percentage 16380 1727204186.66551: checking to see if all hosts have failed and the running result is not ok 16380 1727204186.66552: done checking to see if all hosts have failed 16380 1727204186.66553: getting the remaining hosts for this loop 16380 1727204186.66555: done getting the remaining hosts for this loop 16380 1727204186.66558: getting the next task for host managed-node2 16380 1727204186.66565: done getting next task for host managed-node2 16380 1727204186.66569: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16380 1727204186.66572: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204186.66588: getting variables 16380 1727204186.66592: in VariableManager get_vars() 16380 1727204186.66629: Calling all_inventory to load vars for managed-node2 16380 1727204186.66633: Calling groups_inventory to load vars for managed-node2 16380 1727204186.66635: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204186.66646: Calling all_plugins_play to load vars for managed-node2 16380 1727204186.66649: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204186.66653: Calling groups_plugins_play to load vars for managed-node2 16380 1727204186.71495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204186.73062: done with get_vars() 16380 1727204186.73085: done getting variables 16380 1727204186.73134: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.080) 0:00:47.838 ***** 16380 1727204186.73156: entering _queue_task() for managed-node2/copy 16380 1727204186.73436: worker is 1 (out of 1 available) 16380 1727204186.73451: exiting _queue_task() for managed-node2/copy 16380 1727204186.73464: done queuing things up, now waiting for results queue to drain 16380 1727204186.73466: waiting for pending results... 
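The two tasks above, "Enable and start wpa_supplicant" and "Enable network service", are skipped rather than run: the log records false_condition values of __network_wpa_supplicant_required and network_provider == "initscripts" respectively, and network_provider was evaluated as "nm" on this run. A sketch of that when-gating pattern, with the condition copied from the log and the task body purely illustrative (the role's real task file is not shown here):

    # Illustrative only: shows how a 'when' condition produces the
    # "Conditional result was False" skip seen in the log.
    - name: Enable network service
      ansible.builtin.service:
        name: network
        enabled: true
        state: started
      when: network_provider == "initscripts"  # "nm" on this run, so skipped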
16380 1727204186.73664: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 16380 1727204186.73756: in run() - task 12b410aa-8751-749c-b6eb-000000000069 16380 1727204186.73771: variable 'ansible_search_path' from source: unknown 16380 1727204186.73775: variable 'ansible_search_path' from source: unknown 16380 1727204186.73810: calling self._execute() 16380 1727204186.73901: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204186.73911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204186.73925: variable 'omit' from source: magic vars 16380 1727204186.74250: variable 'ansible_distribution_major_version' from source: facts 16380 1727204186.74259: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204186.74364: variable 'network_provider' from source: set_fact 16380 1727204186.74375: Evaluated conditional (network_provider == "initscripts"): False 16380 1727204186.74379: when evaluation is False, skipping this task 16380 1727204186.74382: _execute() done 16380 1727204186.74385: dumping result to json 16380 1727204186.74390: done dumping result, returning 16380 1727204186.74400: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-749c-b6eb-000000000069] 16380 1727204186.74406: sending task result for task 12b410aa-8751-749c-b6eb-000000000069 16380 1727204186.74504: done sending task result for task 12b410aa-8751-749c-b6eb-000000000069 16380 1727204186.74507: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 16380 1727204186.74559: no more pending results, returning what we have 16380 1727204186.74563: results queue empty 16380 1727204186.74564: checking for any_errors_fatal 16380 1727204186.74573: done checking for any_errors_fatal 16380 1727204186.74574: checking for max_fail_percentage 16380 1727204186.74576: done checking for max_fail_percentage 16380 1727204186.74577: checking to see if all hosts have failed and the running result is not ok 16380 1727204186.74578: done checking to see if all hosts have failed 16380 1727204186.74579: getting the remaining hosts for this loop 16380 1727204186.74581: done getting the remaining hosts for this loop 16380 1727204186.74585: getting the next task for host managed-node2 16380 1727204186.74593: done getting next task for host managed-node2 16380 1727204186.74597: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16380 1727204186.74599: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204186.74615: getting variables 16380 1727204186.74618: in VariableManager get_vars() 16380 1727204186.74654: Calling all_inventory to load vars for managed-node2 16380 1727204186.74657: Calling groups_inventory to load vars for managed-node2 16380 1727204186.74659: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204186.74669: Calling all_plugins_play to load vars for managed-node2 16380 1727204186.74672: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204186.74676: Calling groups_plugins_play to load vars for managed-node2 16380 1727204186.75884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204186.77473: done with get_vars() 16380 1727204186.77496: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.044) 0:00:47.882 ***** 16380 1727204186.77567: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16380 1727204186.77799: worker is 1 (out of 1 available) 16380 1727204186.77812: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 16380 1727204186.77829: done queuing things up, now waiting for results queue to drain 16380 1727204186.77831: waiting for pending results... 16380 1727204186.78024: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 16380 1727204186.78108: in run() - task 12b410aa-8751-749c-b6eb-00000000006a 16380 1727204186.78123: variable 'ansible_search_path' from source: unknown 16380 1727204186.78127: variable 'ansible_search_path' from source: unknown 16380 1727204186.78158: calling self._execute() 16380 1727204186.78247: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204186.78253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204186.78263: variable 'omit' from source: magic vars 16380 1727204186.78595: variable 'ansible_distribution_major_version' from source: facts 16380 1727204186.78607: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204186.78620: variable 'omit' from source: magic vars 16380 1727204186.78656: variable 'omit' from source: magic vars 16380 1727204186.78798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 16380 1727204186.80752: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 16380 1727204186.80809: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 16380 1727204186.80849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 16380 1727204186.80879: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 16380 1727204186.80908: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 16380 1727204186.80969: variable 'network_provider' from source: set_fact 16380 1727204186.81082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 16380 1727204186.81108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 16380 1727204186.81139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 16380 1727204186.81169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 16380 1727204186.81182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 16380 1727204186.81248: variable 'omit' from source: magic vars 16380 1727204186.81340: variable 'omit' from source: magic vars 16380 1727204186.81426: variable 'network_connections' from source: play vars 16380 1727204186.81437: variable 'profile' from source: play vars 16380 1727204186.81494: variable 'profile' from source: play vars 16380 1727204186.81498: variable 'interface' from source: set_fact 16380 1727204186.81550: variable 'interface' from source: set_fact 16380 1727204186.81674: variable 'omit' from source: magic vars 16380 1727204186.81684: variable '__lsr_ansible_managed' from source: task vars 16380 1727204186.81736: variable '__lsr_ansible_managed' from source: task vars 16380 1727204186.81884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 16380 1727204186.82065: Loaded config def from plugin (lookup/template) 16380 1727204186.82070: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 16380 1727204186.82100: File lookup term: get_ansible_managed.j2 16380 1727204186.82104: variable 'ansible_search_path' from source: unknown 16380 1727204186.82108: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 16380 1727204186.82127: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 16380 1727204186.82137: variable 'ansible_search_path' from source: unknown 16380 1727204186.89300: variable 'ansible_managed' from source: unknown 16380 1727204186.89517: variable 'omit' from source: magic 
vars 16380 1727204186.89564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204186.89602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204186.89632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204186.89734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204186.89737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204186.89741: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204186.89743: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204186.89747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204186.89874: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204186.89894: Set connection var ansible_shell_executable to /bin/sh 16380 1727204186.89908: Set connection var ansible_connection to ssh 16380 1727204186.89921: Set connection var ansible_shell_type to sh 16380 1727204186.89934: Set connection var ansible_pipelining to False 16380 1727204186.89958: Set connection var ansible_timeout to 10 16380 1727204186.89997: variable 'ansible_shell_executable' from source: unknown 16380 1727204186.90007: variable 'ansible_connection' from source: unknown 16380 1727204186.90014: variable 'ansible_module_compression' from source: unknown 16380 1727204186.90061: variable 'ansible_shell_type' from source: unknown 16380 1727204186.90064: variable 'ansible_shell_executable' from source: unknown 16380 1727204186.90067: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204186.90069: variable 'ansible_pipelining' from source: unknown 16380 1727204186.90071: variable 'ansible_timeout' from source: unknown 16380 1727204186.90073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204186.90237: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204186.90279: variable 'omit' from source: magic vars 16380 1727204186.90282: starting attempt loop 16380 1727204186.90287: running the handler 16380 1727204186.90388: _low_level_execute_command(): starting 16380 1727204186.90394: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204186.90964: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204186.90981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.91032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204186.91051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204186.91094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204186.92900: stdout chunk (state=3): >>>/root <<< 16380 1727204186.93081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204186.93097: stderr chunk (state=3): >>><<< 16380 1727204186.93106: stdout chunk (state=3): >>><<< 16380 1727204186.93233: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204186.93236: _low_level_execute_command(): starting 16380 1727204186.93240: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982 `" && echo ansible-tmp-1727204186.9313493-19442-154151514425982="` echo /root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982 `" ) && sleep 0' 16380 1727204186.93817: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204186.93846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204186.93862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204186.93879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204186.93911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.93952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 16380 1727204186.94003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.94064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204186.94088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204186.94105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204186.94183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204186.96290: stdout chunk (state=3): >>>ansible-tmp-1727204186.9313493-19442-154151514425982=/root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982 <<< 16380 1727204186.96486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204186.96491: stdout chunk (state=3): >>><<< 16380 1727204186.96494: stderr chunk (state=3): >>><<< 16380 1727204186.96595: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204186.9313493-19442-154151514425982=/root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204186.96599: variable 'ansible_module_compression' from source: unknown 16380 1727204186.96636: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 16380 1727204186.96684: variable 'ansible_facts' from source: unknown 16380 1727204186.96814: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/AnsiballZ_network_connections.py 16380 1727204186.97018: Sending initial data 16380 1727204186.97021: Sent initial data (168 bytes) 16380 1727204186.97638: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204186.97704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204186.97768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204186.97785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204186.97808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204186.97879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204186.99740: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204186.99746: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204186.99749: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpmmlyz0hn /root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/AnsiballZ_network_connections.py <<< 16380 1727204186.99751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/AnsiballZ_network_connections.py" <<< 16380 1727204186.99784: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpmmlyz0hn" to remote "/root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/AnsiballZ_network_connections.py" <<< 16380 1727204187.01411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204187.01654: stderr chunk (state=3): >>><<< 16380 1727204187.01657: stdout chunk (state=3): >>><<< 16380 1727204187.01660: done transferring module to remote 16380 1727204187.01664: _low_level_execute_command(): starting 16380 1727204187.01666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/ /root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/AnsiballZ_network_connections.py && sleep 0' 16380 1727204187.02107: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204187.02129: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204187.02147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.02196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204187.02199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204187.02243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204187.04251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204187.04287: stderr chunk (state=3): >>><<< 16380 1727204187.04292: stdout chunk (state=3): >>><<< 16380 1727204187.04302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204187.04305: _low_level_execute_command(): starting 16380 1727204187.04311: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/AnsiballZ_network_connections.py && sleep 0' 16380 1727204187.04776: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204187.04780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.04783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 16380 1727204187.04785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.04850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204187.04853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204187.04896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204187.35715: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 16380 1727204187.35731: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x_ieu8p4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x_ieu8p4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/528bfc52-b65b-476e-8410-3e9f9aa7eced: error=unknown <<< 16380 1727204187.35899: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 16380 1727204187.37893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204187.38065: stderr chunk (state=3): >>><<< 16380 1727204187.38069: stdout chunk (state=3): >>><<< 16380 1727204187.38073: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x_ieu8p4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x_ieu8p4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/528bfc52-b65b-476e-8410-3e9f9aa7eced: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
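The module just executed (AnsiballZ_network_connections.py) removed the LSR-TST-br31 profile: despite the LsrNetworkNmError traceback ("Connection volatilize aborted ... error=unknown") leaking to stdout, the JSON payload reports changed=true with no failure flag. Reconstructed from the module_args in the log, the play-level input driving this invocation is roughly the following; only the provider and the connections list come from the log, and the surrounding play is an assumed wrapper:

    # Sketch: how a play feeds the fedora.linux_system_roles.network
    # role the connection list seen in this invocation.
    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_provider: nm
            network_connections:
              - name: LSR-TST-br31
                persistent_state: absent   # delete the profile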
16380 1727204187.38076: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204187.38079: _low_level_execute_command(): starting 16380 1727204187.38095: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204186.9313493-19442-154151514425982/ > /dev/null 2>&1 && sleep 0' 16380 1727204187.38640: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204187.38678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.38682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204187.38685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.38735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204187.38739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204187.38788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204187.40806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204187.40876: stderr chunk (state=3): >>><<< 16380 1727204187.40881: stdout chunk (state=3): >>><<< 16380 1727204187.40892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204187.40900: handler run complete 16380 1727204187.40927: attempt loop complete, returning result 16380 1727204187.40930: _execute() done 16380 1727204187.40933: dumping result to json 16380 1727204187.40939: done dumping result, returning 16380 1727204187.40970: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-749c-b6eb-00000000006a] 16380 1727204187.40973: sending task result for task 12b410aa-8751-749c-b6eb-00000000006a 16380 1727204187.41092: done sending task result for task 12b410aa-8751-749c-b6eb-00000000006a 16380 1727204187.41095: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 16380 1727204187.41214: no more pending results, returning what we have 16380 1727204187.41219: results queue empty 16380 1727204187.41220: checking for any_errors_fatal 16380 1727204187.41228: done checking for any_errors_fatal 16380 1727204187.41228: checking for max_fail_percentage 16380 1727204187.41230: done checking for max_fail_percentage 16380 1727204187.41231: checking to see if all hosts have failed and the running result is not ok 16380 1727204187.41232: done checking to see if all hosts have failed 16380 1727204187.41233: getting the remaining hosts for this loop 16380 1727204187.41235: done getting the remaining hosts for this loop 16380 1727204187.41239: getting the next task for host managed-node2 16380 1727204187.41245: done getting next task for host managed-node2 16380 1727204187.41249: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 16380 1727204187.41251: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204187.41263: getting variables 16380 1727204187.41265: in VariableManager get_vars() 16380 1727204187.41339: Calling all_inventory to load vars for managed-node2 16380 1727204187.41343: Calling groups_inventory to load vars for managed-node2 16380 1727204187.41345: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204187.41374: Calling all_plugins_play to load vars for managed-node2 16380 1727204187.41378: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204187.41382: Calling groups_plugins_play to load vars for managed-node2 16380 1727204187.43231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204187.45048: done with get_vars() 16380 1727204187.45078: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.675) 0:00:48.558 ***** 16380 1727204187.45155: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16380 1727204187.45471: worker is 1 (out of 1 available) 16380 1727204187.45486: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 16380 1727204187.45501: done queuing things up, now waiting for results queue to drain 16380 1727204187.45503: waiting for pending results... 16380 1727204187.45744: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 16380 1727204187.45842: in run() - task 12b410aa-8751-749c-b6eb-00000000006b 16380 1727204187.45863: variable 'ansible_search_path' from source: unknown 16380 1727204187.45869: variable 'ansible_search_path' from source: unknown 16380 1727204187.45903: calling self._execute() 16380 1727204187.46016: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.46023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.46031: variable 'omit' from source: magic vars 16380 1727204187.46462: variable 'ansible_distribution_major_version' from source: facts 16380 1727204187.46466: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204187.46570: variable 'network_state' from source: role '' defaults 16380 1727204187.46581: Evaluated conditional (network_state != {}): False 16380 1727204187.46584: when evaluation is False, skipping this task 16380 1727204187.46587: _execute() done 16380 1727204187.46593: dumping result to json 16380 1727204187.46598: done dumping result, returning 16380 1727204187.46606: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-749c-b6eb-00000000006b] 16380 1727204187.46612: sending task result for task 12b410aa-8751-749c-b6eb-00000000006b skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 16380 1727204187.46782: no more pending results, returning what we have 16380 1727204187.46788: results queue empty 16380 1727204187.46790: checking for any_errors_fatal 16380 1727204187.46835: done checking for any_errors_fatal 16380 1727204187.46837: checking for max_fail_percentage 16380 1727204187.46839: done checking for max_fail_percentage 16380 1727204187.46840: checking to see if all hosts have failed and the running result is 
not ok 16380 1727204187.46841: done checking to see if all hosts have failed 16380 1727204187.46842: getting the remaining hosts for this loop 16380 1727204187.46844: done getting the remaining hosts for this loop 16380 1727204187.46848: getting the next task for host managed-node2 16380 1727204187.46855: done getting next task for host managed-node2 16380 1727204187.46859: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16380 1727204187.46861: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204187.46872: done sending task result for task 12b410aa-8751-749c-b6eb-00000000006b 16380 1727204187.46874: WORKER PROCESS EXITING 16380 1727204187.46886: getting variables 16380 1727204187.46887: in VariableManager get_vars() 16380 1727204187.46938: Calling all_inventory to load vars for managed-node2 16380 1727204187.46941: Calling groups_inventory to load vars for managed-node2 16380 1727204187.46944: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204187.46954: Calling all_plugins_play to load vars for managed-node2 16380 1727204187.46957: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204187.46961: Calling groups_plugins_play to load vars for managed-node2 16380 1727204187.48695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204187.50755: done with get_vars() 16380 1727204187.50782: done getting variables 16380 1727204187.50840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.057) 0:00:48.615 ***** 16380 1727204187.50881: entering _queue_task() for managed-node2/debug 16380 1727204187.51161: worker is 1 (out of 1 available) 16380 1727204187.51177: exiting _queue_task() for managed-node2/debug 16380 1727204187.51192: done queuing things up, now waiting for results queue to drain 16380 1727204187.51194: waiting for pending results... 
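The "Configure networking state" task above is skipped because the role-level default for network_state is an empty dict, so the conditional network_state != {} evaluates False. Any non-empty network_state would make it run; a minimal illustrative sketch follows (the interface values are invented for illustration and do not come from this run):

    # Illustrative values only; any non-empty dict satisfies the
    # `network_state != {}` conditional evaluated above.
    network_state:
      interfaces:
        - name: eth1
          type: ethernet
          state: up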
16380 1727204187.51431: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 16380 1727204187.51525: in run() - task 12b410aa-8751-749c-b6eb-00000000006c 16380 1727204187.51538: variable 'ansible_search_path' from source: unknown 16380 1727204187.51542: variable 'ansible_search_path' from source: unknown 16380 1727204187.51580: calling self._execute() 16380 1727204187.51669: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.51675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.51693: variable 'omit' from source: magic vars 16380 1727204187.52024: variable 'ansible_distribution_major_version' from source: facts 16380 1727204187.52034: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204187.52038: variable 'omit' from source: magic vars 16380 1727204187.52072: variable 'omit' from source: magic vars 16380 1727204187.52106: variable 'omit' from source: magic vars 16380 1727204187.52147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204187.52178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204187.52198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204187.52214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204187.52227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204187.52258: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204187.52262: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.52266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.52351: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204187.52360: Set connection var ansible_shell_executable to /bin/sh 16380 1727204187.52365: Set connection var ansible_connection to ssh 16380 1727204187.52372: Set connection var ansible_shell_type to sh 16380 1727204187.52379: Set connection var ansible_pipelining to False 16380 1727204187.52387: Set connection var ansible_timeout to 10 16380 1727204187.52409: variable 'ansible_shell_executable' from source: unknown 16380 1727204187.52412: variable 'ansible_connection' from source: unknown 16380 1727204187.52415: variable 'ansible_module_compression' from source: unknown 16380 1727204187.52421: variable 'ansible_shell_type' from source: unknown 16380 1727204187.52424: variable 'ansible_shell_executable' from source: unknown 16380 1727204187.52426: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.52431: variable 'ansible_pipelining' from source: unknown 16380 1727204187.52433: variable 'ansible_timeout' from source: unknown 16380 1727204187.52439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.52561: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 
1727204187.52573: variable 'omit' from source: magic vars 16380 1727204187.52577: starting attempt loop 16380 1727204187.52585: running the handler 16380 1727204187.52694: variable '__network_connections_result' from source: set_fact 16380 1727204187.52738: handler run complete 16380 1727204187.52756: attempt loop complete, returning result 16380 1727204187.52759: _execute() done 16380 1727204187.52762: dumping result to json 16380 1727204187.52768: done dumping result, returning 16380 1727204187.52776: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-749c-b6eb-00000000006c] 16380 1727204187.52783: sending task result for task 12b410aa-8751-749c-b6eb-00000000006c 16380 1727204187.52872: done sending task result for task 12b410aa-8751-749c-b6eb-00000000006c 16380 1727204187.52875: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 16380 1727204187.52965: no more pending results, returning what we have 16380 1727204187.52969: results queue empty 16380 1727204187.52970: checking for any_errors_fatal 16380 1727204187.52976: done checking for any_errors_fatal 16380 1727204187.52977: checking for max_fail_percentage 16380 1727204187.52979: done checking for max_fail_percentage 16380 1727204187.52980: checking to see if all hosts have failed and the running result is not ok 16380 1727204187.52981: done checking to see if all hosts have failed 16380 1727204187.52982: getting the remaining hosts for this loop 16380 1727204187.52984: done getting the remaining hosts for this loop 16380 1727204187.52988: getting the next task for host managed-node2 16380 1727204187.52997: done getting next task for host managed-node2 16380 1727204187.53003: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16380 1727204187.53006: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204187.53016: getting variables 16380 1727204187.53019: in VariableManager get_vars() 16380 1727204187.53056: Calling all_inventory to load vars for managed-node2 16380 1727204187.53059: Calling groups_inventory to load vars for managed-node2 16380 1727204187.53061: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204187.53071: Calling all_plugins_play to load vars for managed-node2 16380 1727204187.53074: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204187.53077: Calling groups_plugins_play to load vars for managed-node2 16380 1727204187.54305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204187.55887: done with get_vars() 16380 1727204187.55911: done getting variables 16380 1727204187.55962: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.051) 0:00:48.666 ***** 16380 1727204187.55987: entering _queue_task() for managed-node2/debug 16380 1727204187.56233: worker is 1 (out of 1 available) 16380 1727204187.56247: exiting _queue_task() for managed-node2/debug 16380 1727204187.56260: done queuing things up, now waiting for results queue to drain 16380 1727204187.56263: waiting for pending results... 16380 1727204187.56470: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 16380 1727204187.56556: in run() - task 12b410aa-8751-749c-b6eb-00000000006d 16380 1727204187.56569: variable 'ansible_search_path' from source: unknown 16380 1727204187.56573: variable 'ansible_search_path' from source: unknown 16380 1727204187.56614: calling self._execute() 16380 1727204187.56699: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.56705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.56722: variable 'omit' from source: magic vars 16380 1727204187.57055: variable 'ansible_distribution_major_version' from source: facts 16380 1727204187.57062: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204187.57069: variable 'omit' from source: magic vars 16380 1727204187.57102: variable 'omit' from source: magic vars 16380 1727204187.57136: variable 'omit' from source: magic vars 16380 1727204187.57177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204187.57209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204187.57228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204187.57244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204187.57261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204187.57285: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204187.57290: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.57295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.57378: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204187.57391: Set connection var ansible_shell_executable to /bin/sh 16380 1727204187.57399: Set connection var ansible_connection to ssh 16380 1727204187.57406: Set connection var ansible_shell_type to sh 16380 1727204187.57412: Set connection var ansible_pipelining to False 16380 1727204187.57423: Set connection var ansible_timeout to 10 16380 1727204187.57442: variable 'ansible_shell_executable' from source: unknown 16380 1727204187.57445: variable 'ansible_connection' from source: unknown 16380 1727204187.57448: variable 'ansible_module_compression' from source: unknown 16380 1727204187.57452: variable 'ansible_shell_type' from source: unknown 16380 1727204187.57455: variable 'ansible_shell_executable' from source: unknown 16380 1727204187.57460: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.57465: variable 'ansible_pipelining' from source: unknown 16380 1727204187.57468: variable 'ansible_timeout' from source: unknown 16380 1727204187.57479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.57604: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204187.57615: variable 'omit' from source: magic vars 16380 1727204187.57699: starting attempt loop 16380 1727204187.57704: running the handler 16380 1727204187.57707: variable '__network_connections_result' from source: set_fact 16380 1727204187.57732: variable '__network_connections_result' from source: set_fact 16380 1727204187.57828: handler run complete 16380 1727204187.57848: attempt loop complete, returning result 16380 1727204187.57851: _execute() done 16380 1727204187.57854: dumping result to json 16380 1727204187.57860: done dumping result, returning 16380 1727204187.57868: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-749c-b6eb-00000000006d] 16380 1727204187.57873: sending task result for task 12b410aa-8751-749c-b6eb-00000000006d 16380 1727204187.57973: done sending task result for task 12b410aa-8751-749c-b6eb-00000000006d 16380 1727204187.57976: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 16380 1727204187.58067: no more pending results, returning what we have 16380 1727204187.58070: results queue empty 16380 1727204187.58071: checking for any_errors_fatal 16380 1727204187.58075: done checking for any_errors_fatal 16380 1727204187.58076: checking for max_fail_percentage 16380 1727204187.58078: done checking for max_fail_percentage 16380 1727204187.58079: checking to 
see if all hosts have failed and the running result is not ok 16380 1727204187.58080: done checking to see if all hosts have failed 16380 1727204187.58081: getting the remaining hosts for this loop 16380 1727204187.58082: done getting the remaining hosts for this loop 16380 1727204187.58085: getting the next task for host managed-node2 16380 1727204187.58093: done getting next task for host managed-node2 16380 1727204187.58097: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16380 1727204187.58099: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204187.58109: getting variables 16380 1727204187.58111: in VariableManager get_vars() 16380 1727204187.58147: Calling all_inventory to load vars for managed-node2 16380 1727204187.58150: Calling groups_inventory to load vars for managed-node2 16380 1727204187.58153: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204187.58162: Calling all_plugins_play to load vars for managed-node2 16380 1727204187.58165: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204187.58169: Calling groups_plugins_play to load vars for managed-node2 16380 1727204187.59508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204187.61080: done with get_vars() 16380 1727204187.61103: done getting variables 16380 1727204187.61156: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.051) 0:00:48.718 ***** 16380 1727204187.61184: entering _queue_task() for managed-node2/debug 16380 1727204187.61426: worker is 1 (out of 1 available) 16380 1727204187.61441: exiting _queue_task() for managed-node2/debug 16380 1727204187.61453: done queuing things up, now waiting for results queue to drain 16380 1727204187.61455: waiting for pending results... 
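Per the task paths logged above (roles/network/tasks/main.yml:177 and main.yml:181), the two debug tasks that just ran print __network_connections_result.stderr_lines and __network_connections_result respectively. Their likely shape, sketched from the logged variable names rather than quoted from main.yml:

    # Sketch only; the task names match the log, but the exact YAML
    # in main.yml is assumed rather than quoted.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result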
16380 1727204187.61655: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 16380 1727204187.61745: in run() - task 12b410aa-8751-749c-b6eb-00000000006e 16380 1727204187.61759: variable 'ansible_search_path' from source: unknown 16380 1727204187.61763: variable 'ansible_search_path' from source: unknown 16380 1727204187.61800: calling self._execute() 16380 1727204187.61891: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.61896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.61909: variable 'omit' from source: magic vars 16380 1727204187.62242: variable 'ansible_distribution_major_version' from source: facts 16380 1727204187.62260: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204187.62370: variable 'network_state' from source: role '' defaults 16380 1727204187.62380: Evaluated conditional (network_state != {}): False 16380 1727204187.62383: when evaluation is False, skipping this task 16380 1727204187.62386: _execute() done 16380 1727204187.62392: dumping result to json 16380 1727204187.62399: done dumping result, returning 16380 1727204187.62408: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-749c-b6eb-00000000006e] 16380 1727204187.62413: sending task result for task 12b410aa-8751-749c-b6eb-00000000006e 16380 1727204187.62508: done sending task result for task 12b410aa-8751-749c-b6eb-00000000006e 16380 1727204187.62511: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 16380 1727204187.62564: no more pending results, returning what we have 16380 1727204187.62568: results queue empty 16380 1727204187.62569: checking for any_errors_fatal 16380 1727204187.62577: done checking for any_errors_fatal 16380 1727204187.62578: checking for max_fail_percentage 16380 1727204187.62580: done checking for max_fail_percentage 16380 1727204187.62581: checking to see if all hosts have failed and the running result is not ok 16380 1727204187.62582: done checking to see if all hosts have failed 16380 1727204187.62583: getting the remaining hosts for this loop 16380 1727204187.62585: done getting the remaining hosts for this loop 16380 1727204187.62591: getting the next task for host managed-node2 16380 1727204187.62597: done getting next task for host managed-node2 16380 1727204187.62602: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 16380 1727204187.62604: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204187.62620: getting variables 16380 1727204187.62621: in VariableManager get_vars() 16380 1727204187.62655: Calling all_inventory to load vars for managed-node2 16380 1727204187.62658: Calling groups_inventory to load vars for managed-node2 16380 1727204187.62661: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204187.62671: Calling all_plugins_play to load vars for managed-node2 16380 1727204187.62674: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204187.62677: Calling groups_plugins_play to load vars for managed-node2 16380 1727204187.63982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204187.65561: done with get_vars() 16380 1727204187.65582: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.044) 0:00:48.763 ***** 16380 1727204187.65660: entering _queue_task() for managed-node2/ping 16380 1727204187.65880: worker is 1 (out of 1 available) 16380 1727204187.65898: exiting _queue_task() for managed-node2/ping 16380 1727204187.65909: done queuing things up, now waiting for results queue to drain 16380 1727204187.65911: waiting for pending results... 16380 1727204187.66104: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 16380 1727204187.66195: in run() - task 12b410aa-8751-749c-b6eb-00000000006f 16380 1727204187.66210: variable 'ansible_search_path' from source: unknown 16380 1727204187.66214: variable 'ansible_search_path' from source: unknown 16380 1727204187.66253: calling self._execute() 16380 1727204187.66337: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.66342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.66355: variable 'omit' from source: magic vars 16380 1727204187.66685: variable 'ansible_distribution_major_version' from source: facts 16380 1727204187.66705: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204187.66709: variable 'omit' from source: magic vars 16380 1727204187.66744: variable 'omit' from source: magic vars 16380 1727204187.66773: variable 'omit' from source: magic vars 16380 1727204187.66814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204187.66848: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204187.66864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204187.66881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204187.66893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204187.66927: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204187.66931: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.66933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.67016: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204187.67028: Set 
connection var ansible_shell_executable to /bin/sh 16380 1727204187.67035: Set connection var ansible_connection to ssh 16380 1727204187.67041: Set connection var ansible_shell_type to sh 16380 1727204187.67049: Set connection var ansible_pipelining to False 16380 1727204187.67058: Set connection var ansible_timeout to 10 16380 1727204187.67081: variable 'ansible_shell_executable' from source: unknown 16380 1727204187.67085: variable 'ansible_connection' from source: unknown 16380 1727204187.67088: variable 'ansible_module_compression' from source: unknown 16380 1727204187.67094: variable 'ansible_shell_type' from source: unknown 16380 1727204187.67097: variable 'ansible_shell_executable' from source: unknown 16380 1727204187.67101: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204187.67106: variable 'ansible_pipelining' from source: unknown 16380 1727204187.67110: variable 'ansible_timeout' from source: unknown 16380 1727204187.67116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204187.67299: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204187.67309: variable 'omit' from source: magic vars 16380 1727204187.67315: starting attempt loop 16380 1727204187.67318: running the handler 16380 1727204187.67336: _low_level_execute_command(): starting 16380 1727204187.67342: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204187.67901: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204187.67905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.67908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204187.67911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.67970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204187.67974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204187.67979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204187.68026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204187.69825: stdout chunk (state=3): >>>/root <<< 16380 1727204187.69935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204187.69992: stderr chunk (state=3): >>><<< 16380 1727204187.69996: stdout chunk (state=3): >>><<< 16380 1727204187.70017: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204187.70031: _low_level_execute_command(): starting 16380 1727204187.70037: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476 `" && echo ansible-tmp-1727204187.7001665-19469-91805392337476="` echo /root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476 `" ) && sleep 0' 16380 1727204187.70500: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204187.70504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204187.70506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204187.70521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204187.70524: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.70567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204187.70573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204187.70616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204187.72736: stdout chunk (state=3): >>>ansible-tmp-1727204187.7001665-19469-91805392337476=/root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476 <<< 16380 1727204187.72858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204187.72904: stderr chunk (state=3): >>><<< 16380 1727204187.72914: stdout chunk (state=3): >>><<< 16380 1727204187.72928: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1727204187.7001665-19469-91805392337476=/root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204187.72966: variable 'ansible_module_compression' from source: unknown 16380 1727204187.73003: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 16380 1727204187.73040: variable 'ansible_facts' from source: unknown 16380 1727204187.73099: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/AnsiballZ_ping.py 16380 1727204187.73210: Sending initial data 16380 1727204187.73214: Sent initial data (152 bytes) 16380 1727204187.73677: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204187.73681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204187.73683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204187.73687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.73743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204187.73749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204187.73788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204187.75470: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 16380 1727204187.75477: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204187.75506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204187.75550: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpmo0z8oyo /root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/AnsiballZ_ping.py <<< 16380 1727204187.75554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/AnsiballZ_ping.py" <<< 16380 1727204187.75583: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpmo0z8oyo" to remote "/root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/AnsiballZ_ping.py" <<< 16380 1727204187.76328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204187.76387: stderr chunk (state=3): >>><<< 16380 1727204187.76392: stdout chunk (state=3): >>><<< 16380 1727204187.76416: done transferring module to remote 16380 1727204187.76428: _low_level_execute_command(): starting 16380 1727204187.76433: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/ /root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/AnsiballZ_ping.py && sleep 0' 16380 1727204187.76877: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204187.76882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.76885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204187.76888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.76946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204187.76959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204187.76988: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 16380 1727204187.78924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204187.78972: stderr chunk (state=3): >>><<< 16380 1727204187.78976: stdout chunk (state=3): >>><<< 16380 1727204187.78993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204187.78996: _low_level_execute_command(): starting 16380 1727204187.79003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/AnsiballZ_ping.py && sleep 0' 16380 1727204187.79464: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204187.79467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204187.79470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.79474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204187.79476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.79529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204187.79533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204187.79584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204187.97155: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 16380 1727204187.98901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204187.98905: stdout chunk (state=3): >>><<< 16380 1727204187.98908: stderr chunk (state=3): >>><<< 16380 1727204187.98911: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204187.98914: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204187.98916: _low_level_execute_command(): starting 16380 1727204187.98921: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204187.7001665-19469-91805392337476/ > /dev/null 2>&1 && sleep 0' 16380 1727204187.99613: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204187.99640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204187.99659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 16380 1727204187.99681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204187.99832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204188.01858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204188.01933: stderr chunk (state=3): >>><<< 16380 1727204188.01943: stdout chunk (state=3): >>><<< 16380 1727204188.01968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204188.01986: handler run complete 16380 1727204188.02195: attempt loop complete, returning result 16380 1727204188.02198: _execute() done 16380 1727204188.02201: dumping result to json 16380 1727204188.02207: done dumping result, returning 16380 1727204188.02209: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-749c-b6eb-00000000006f] 16380 1727204188.02212: sending task result for task 12b410aa-8751-749c-b6eb-00000000006f 16380 1727204188.02286: done sending task result for task 12b410aa-8751-749c-b6eb-00000000006f 16380 1727204188.02292: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 16380 1727204188.02377: no more pending results, returning what we have 16380 1727204188.02380: results queue empty 16380 1727204188.02381: checking for any_errors_fatal 16380 1727204188.02387: done checking for any_errors_fatal 16380 1727204188.02388: checking for max_fail_percentage 16380 1727204188.02392: done checking for max_fail_percentage 16380 1727204188.02393: checking to see if all hosts have failed and the running result is not ok 16380 1727204188.02394: done checking to see if all hosts have failed 16380 1727204188.02395: getting the remaining hosts for this loop 16380 1727204188.02397: done getting the remaining hosts for this loop 16380 1727204188.02401: getting the next task for host managed-node2 16380 1727204188.02408: done getting next task for host managed-node2 16380 1727204188.02411: ^ task is: TASK: meta (role_complete) 16380 1727204188.02413: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204188.02424: getting variables 16380 1727204188.02426: in VariableManager get_vars() 16380 1727204188.02464: Calling all_inventory to load vars for managed-node2 16380 1727204188.02468: Calling groups_inventory to load vars for managed-node2 16380 1727204188.02470: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204188.02480: Calling all_plugins_play to load vars for managed-node2 16380 1727204188.02483: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204188.02486: Calling groups_plugins_play to load vars for managed-node2 16380 1727204188.04697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204188.07664: done with get_vars() 16380 1727204188.07710: done getting variables 16380 1727204188.07816: done queuing things up, now waiting for results queue to drain 16380 1727204188.07821: results queue empty 16380 1727204188.07823: checking for any_errors_fatal 16380 1727204188.07827: done checking for any_errors_fatal 16380 1727204188.07828: checking for max_fail_percentage 16380 1727204188.07829: done checking for max_fail_percentage 16380 1727204188.07831: checking to see if all hosts have failed and the running result is not ok 16380 1727204188.07831: done checking to see if all hosts have failed 16380 1727204188.07832: getting the remaining hosts for this loop 16380 1727204188.07834: done getting the remaining hosts for this loop 16380 1727204188.07837: getting the next task for host managed-node2 16380 1727204188.07841: done getting next task for host managed-node2 16380 1727204188.07843: ^ task is: TASK: meta (flush_handlers) 16380 1727204188.07845: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204188.07849: getting variables 16380 1727204188.07850: in VariableManager get_vars() 16380 1727204188.07866: Calling all_inventory to load vars for managed-node2 16380 1727204188.07869: Calling groups_inventory to load vars for managed-node2 16380 1727204188.07872: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204188.07878: Calling all_plugins_play to load vars for managed-node2 16380 1727204188.07882: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204188.07886: Calling groups_plugins_play to load vars for managed-node2 16380 1727204188.09976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204188.13308: done with get_vars() 16380 1727204188.13355: done getting variables 16380 1727204188.13430: in VariableManager get_vars() 16380 1727204188.13445: Calling all_inventory to load vars for managed-node2 16380 1727204188.13449: Calling groups_inventory to load vars for managed-node2 16380 1727204188.13452: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204188.13458: Calling all_plugins_play to load vars for managed-node2 16380 1727204188.13462: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204188.13466: Calling groups_plugins_play to load vars for managed-node2 16380 1727204188.15977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204188.18790: done with get_vars() 16380 1727204188.18836: done queuing things up, now waiting for results queue to drain 16380 1727204188.18839: results queue empty 16380 1727204188.18840: checking for any_errors_fatal 16380 1727204188.18841: done checking for any_errors_fatal 16380 1727204188.18842: checking for max_fail_percentage 16380 1727204188.18844: done checking for max_fail_percentage 16380 1727204188.18844: checking to see if all hosts have failed and the running result is not ok 16380 1727204188.18845: done checking to see if all hosts have failed 16380 1727204188.18846: getting the remaining hosts for this loop 16380 1727204188.18847: done getting the remaining hosts for this loop 16380 1727204188.18851: getting the next task for host managed-node2 16380 1727204188.18856: done getting next task for host managed-node2 16380 1727204188.18857: ^ task is: TASK: meta (flush_handlers) 16380 1727204188.18859: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204188.18862: getting variables 16380 1727204188.18863: in VariableManager get_vars() 16380 1727204188.18881: Calling all_inventory to load vars for managed-node2 16380 1727204188.18884: Calling groups_inventory to load vars for managed-node2 16380 1727204188.18887: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204188.18897: Calling all_plugins_play to load vars for managed-node2 16380 1727204188.18899: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204188.18903: Calling groups_plugins_play to load vars for managed-node2 16380 1727204188.20960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204188.23938: done with get_vars() 16380 1727204188.23976: done getting variables 16380 1727204188.24048: in VariableManager get_vars() 16380 1727204188.24064: Calling all_inventory to load vars for managed-node2 16380 1727204188.24067: Calling groups_inventory to load vars for managed-node2 16380 1727204188.24070: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204188.24076: Calling all_plugins_play to load vars for managed-node2 16380 1727204188.24079: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204188.24083: Calling groups_plugins_play to load vars for managed-node2 16380 1727204188.26207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204188.29258: done with get_vars() 16380 1727204188.29304: done queuing things up, now waiting for results queue to drain 16380 1727204188.29306: results queue empty 16380 1727204188.29307: checking for any_errors_fatal 16380 1727204188.29309: done checking for any_errors_fatal 16380 1727204188.29310: checking for max_fail_percentage 16380 1727204188.29312: done checking for max_fail_percentage 16380 1727204188.29313: checking to see if all hosts have failed and the running result is not ok 16380 1727204188.29314: done checking to see if all hosts have failed 16380 1727204188.29315: getting the remaining hosts for this loop 16380 1727204188.29316: done getting the remaining hosts for this loop 16380 1727204188.29319: getting the next task for host managed-node2 16380 1727204188.29323: done getting next task for host managed-node2 16380 1727204188.29325: ^ task is: None 16380 1727204188.29326: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204188.29328: done queuing things up, now waiting for results queue to drain 16380 1727204188.29329: results queue empty 16380 1727204188.29330: checking for any_errors_fatal 16380 1727204188.29331: done checking for any_errors_fatal 16380 1727204188.29332: checking for max_fail_percentage 16380 1727204188.29333: done checking for max_fail_percentage 16380 1727204188.29341: checking to see if all hosts have failed and the running result is not ok 16380 1727204188.29342: done checking to see if all hosts have failed 16380 1727204188.29343: getting the next task for host managed-node2 16380 1727204188.29347: done getting next task for host managed-node2 16380 1727204188.29348: ^ task is: None 16380 1727204188.29349: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204188.29403: in VariableManager get_vars() 16380 1727204188.29423: done with get_vars() 16380 1727204188.29430: in VariableManager get_vars() 16380 1727204188.29441: done with get_vars() 16380 1727204188.29454: variable 'omit' from source: magic vars 16380 1727204188.29600: variable 'task' from source: play vars 16380 1727204188.29638: in VariableManager get_vars() 16380 1727204188.29652: done with get_vars() 16380 1727204188.29683: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 16380 1727204188.30012: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204188.30037: getting the remaining hosts for this loop 16380 1727204188.30039: done getting the remaining hosts for this loop 16380 1727204188.30042: getting the next task for host managed-node2 16380 1727204188.30045: done getting next task for host managed-node2 16380 1727204188.30048: ^ task is: TASK: Gathering Facts 16380 1727204188.30050: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204188.30052: getting variables 16380 1727204188.30054: in VariableManager get_vars() 16380 1727204188.30064: Calling all_inventory to load vars for managed-node2 16380 1727204188.30066: Calling groups_inventory to load vars for managed-node2 16380 1727204188.30070: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204188.30076: Calling all_plugins_play to load vars for managed-node2 16380 1727204188.30079: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204188.30083: Calling groups_plugins_play to load vars for managed-node2 16380 1727204188.32324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204188.35310: done with get_vars() 16380 1727204188.35352: done getting variables 16380 1727204188.35409: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:56:28 -0400 (0:00:00.697) 0:00:49.461 ***** 16380 1727204188.35450: entering _queue_task() for managed-node2/gather_facts 16380 1727204188.35843: worker is 1 (out of 1 available) 16380 1727204188.35857: exiting _queue_task() for managed-node2/gather_facts 16380 1727204188.35879: done queuing things up, now waiting for results queue to drain 16380 1727204188.35881: waiting for pending results... 
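The trace above opens a new play, "Run the tasklist tasks/assert_profile_absent.yml", whose first task is fact gathering at run_tasks.yml:3; later in this section the same play includes the tasklist named by the 'task' play variable at run_tasks.yml:6. A minimal sketch of a playbook with that shape, offered purely as a hypothetical reconstruction for orientation (the actual run_tasks.yml shipped with the fedora.linux_system_roles tests may differ):

    # run_tasks.yml -- hypothetical sketch, not the collection's actual file
    - name: Run the tasklist {{ task }}
      hosts: all
      # gather_facts defaults to true, producing the "Gathering Facts" task above
      vars:
        task: tasks/assert_profile_absent.yml   # the trace reports 'task' from play vars
      tasks:
        - name: Include the task '{{ task }}'
          include_tasks: "{{ task }}"           # dynamically includes the tasklist

The included file then runs in the context of the facts gathered here, which is why the play re-gathers facts before the include.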
16380 1727204188.36207: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204188.36216: in run() - task 12b410aa-8751-749c-b6eb-00000000046e 16380 1727204188.36229: variable 'ansible_search_path' from source: unknown 16380 1727204188.36274: calling self._execute() 16380 1727204188.36390: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204188.36405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204188.36425: variable 'omit' from source: magic vars 16380 1727204188.36865: variable 'ansible_distribution_major_version' from source: facts 16380 1727204188.36887: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204188.36903: variable 'omit' from source: magic vars 16380 1727204188.36941: variable 'omit' from source: magic vars 16380 1727204188.36992: variable 'omit' from source: magic vars 16380 1727204188.37048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204188.37096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204188.37126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204188.37154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204188.37171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204188.37210: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204188.37394: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204188.37398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204188.37400: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204188.37402: Set connection var ansible_shell_executable to /bin/sh 16380 1727204188.37405: Set connection var ansible_connection to ssh 16380 1727204188.37407: Set connection var ansible_shell_type to sh 16380 1727204188.37410: Set connection var ansible_pipelining to False 16380 1727204188.37412: Set connection var ansible_timeout to 10 16380 1727204188.37438: variable 'ansible_shell_executable' from source: unknown 16380 1727204188.37447: variable 'ansible_connection' from source: unknown 16380 1727204188.37455: variable 'ansible_module_compression' from source: unknown 16380 1727204188.37462: variable 'ansible_shell_type' from source: unknown 16380 1727204188.37469: variable 'ansible_shell_executable' from source: unknown 16380 1727204188.37476: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204188.37484: variable 'ansible_pipelining' from source: unknown 16380 1727204188.37493: variable 'ansible_timeout' from source: unknown 16380 1727204188.37502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204188.37715: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204188.37738: variable 'omit' from source: magic vars 16380 1727204188.37750: starting attempt loop 16380 1727204188.37757: running the 
handler 16380 1727204188.37778: variable 'ansible_facts' from source: unknown 16380 1727204188.37807: _low_level_execute_command(): starting 16380 1727204188.37823: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204188.38607: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204188.38685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204188.38704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204188.38782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204188.40597: stdout chunk (state=3): >>>/root <<< 16380 1727204188.40896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204188.40899: stdout chunk (state=3): >>><<< 16380 1727204188.40902: stderr chunk (state=3): >>><<< 16380 1727204188.40906: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204188.40908: _low_level_execute_command(): starting 16380 1727204188.40911: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059 `" && echo ansible-tmp-1727204188.408205-19489-90610006531059="` echo /root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059 `" ) && sleep 0' 16380 1727204188.41488: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 <<< 16380 1727204188.41607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204188.41634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204188.41657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204188.41738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204188.43884: stdout chunk (state=3): >>>ansible-tmp-1727204188.408205-19489-90610006531059=/root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059 <<< 16380 1727204188.44062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204188.44073: stdout chunk (state=3): >>><<< 16380 1727204188.44084: stderr chunk (state=3): >>><<< 16380 1727204188.44109: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204188.408205-19489-90610006531059=/root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204188.44155: variable 'ansible_module_compression' from source: unknown 16380 1727204188.44217: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204188.44298: variable 'ansible_facts' from source: unknown 16380 1727204188.44508: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/AnsiballZ_setup.py 16380 1727204188.44781: Sending initial data 16380 1727204188.44785: Sent initial data (152 bytes) 
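Because this connection was set up with ansible_pipelining False (see "Set connection var ansible_pipelining to False" above), module execution takes the file-based path traced here: discover the remote home with 'echo ~', create a mode-0700 temp directory under ~/.ansible/tmp (the 'umask 77 && mkdir' command above), sftp the AnsiballZ_setup.py payload into it, mark it executable, run it with the remote /usr/bin/python3.12, and finally delete the directory. With pipelining enabled, Ansible instead feeds the module source to the remote interpreter over stdin and skips the temp-dir and transfer steps. A hedged sketch of turning that on through YAML inventory (ansible_pipelining and ansible_host are standard Ansible variables; the file name and layout are illustrative, not taken from this run):

    # inventory.yml -- illustrative layout for this test host
    all:
      hosts:
        managed-node2:
          ansible_host: 10.31.9.159    # address used throughout the trace
      vars:
        ansible_pipelining: true       # pipe modules over stdin instead of the
                                       # mkdir/sftp/chmod/rm sequence seen here

Note that pipelining only works with become when requiretty is disabled in the target's sudoers, which is why it defaults to false.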
16380 1727204188.45355: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204188.45370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204188.45385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204188.45414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204188.45541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204188.45553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204188.45575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204188.45660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204188.47357: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 16380 1727204188.47397: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204188.47429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204188.47490: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmphi2mbeyf /root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/AnsiballZ_setup.py <<< 16380 1727204188.47495: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/AnsiballZ_setup.py" <<< 16380 1727204188.47533: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmphi2mbeyf" to remote "/root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/AnsiballZ_setup.py" <<< 16380 1727204188.50043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204188.50104: stderr chunk (state=3): >>><<< 16380 1727204188.50117: stdout chunk (state=3): >>><<< 16380 1727204188.50172: done transferring module to remote 16380 1727204188.50238: _low_level_execute_command(): starting 16380 1727204188.50248: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/ /root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/AnsiballZ_setup.py && sleep 0' 16380 1727204188.51578: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204188.51607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204188.51625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204188.51843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204188.51898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204188.52020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204188.54098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204188.54161: stdout chunk (state=3): >>><<< 16380 1727204188.54164: stderr chunk (state=3): >>><<< 16380 1727204188.54189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204188.54216: _low_level_execute_command(): starting 16380 1727204188.54231: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/AnsiballZ_setup.py && sleep 0' 16380 1727204188.55975: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204188.55979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204188.56270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204188.56274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204188.56567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204189.25394: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", 
"ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "28", "epoch": "1727204188", "epoch_int": "1727204188", "date": "2024-09-24", "time": "14:56:28", "iso8601_micro": "2024-09-24T18:56:28.887341Z", "iso8601": "2024-09-24T18:56:28Z", "iso8601_basic": "20240924T145628887341", "iso8601_basic_short": "20240924T145628", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2842, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 875, "free": 2842}, "nocache": {"free": 3471, "used": 246}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 692, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147436032, "block_size": 4096, "block_total": 64479564, "block_available": 61315292, "block_used": 3164272, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "", "ansible_loadavg": {"1m": 0.7158203125, "5m": 
0.57763671875, "15m": 0.36279296875}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204189.27978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204189.27982: stdout chunk (state=3): >>><<< 16380 1727204189.27984: stderr chunk (state=3): >>><<< 16380 1727204189.27988: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", 
"ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "28", "epoch": "1727204188", "epoch_int": "1727204188", "date": "2024-09-24", "time": "14:56:28", "iso8601_micro": "2024-09-24T18:56:28.887341Z", "iso8601": "2024-09-24T18:56:28Z", "iso8601_basic": "20240924T145628887341", "iso8601_basic_short": "20240924T145628", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2842, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 875, "free": 2842}, "nocache": {"free": 3471, "used": 246}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": 
"mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 692, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147436032, "block_size": 4096, "block_total": 64479564, "block_available": 61315292, "block_used": 3164272, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "", "ansible_loadavg": {"1m": 0.7158203125, "5m": 0.57763671875, "15m": 0.36279296875}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204189.28769: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204189.29032: _low_level_execute_command(): starting 16380 1727204189.29139: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204188.408205-19489-90610006531059/ > /dev/null 2>&1 && sleep 0' 16380 1727204189.30368: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204189.30486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204189.30737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204189.30774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204189.32831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204189.32842: stdout chunk (state=3): >>><<< 16380 
1727204189.32854: stderr chunk (state=3): >>><<< 16380 1727204189.32874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204189.32903: handler run complete 16380 1727204189.33195: variable 'ansible_facts' from source: unknown 16380 1727204189.33529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204189.34579: variable 'ansible_facts' from source: unknown 16380 1727204189.35022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204189.35681: attempt loop complete, returning result 16380 1727204189.35740: _execute() done 16380 1727204189.35763: dumping result to json 16380 1727204189.36093: done dumping result, returning 16380 1727204189.36097: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-00000000046e] 16380 1727204189.36099: sending task result for task 12b410aa-8751-749c-b6eb-00000000046e 16380 1727204189.36695: done sending task result for task 12b410aa-8751-749c-b6eb-00000000046e 16380 1727204189.36699: WORKER PROCESS EXITING ok: [managed-node2] 16380 1727204189.37647: no more pending results, returning what we have 16380 1727204189.37650: results queue empty 16380 1727204189.37652: checking for any_errors_fatal 16380 1727204189.37653: done checking for any_errors_fatal 16380 1727204189.37654: checking for max_fail_percentage 16380 1727204189.37656: done checking for max_fail_percentage 16380 1727204189.37657: checking to see if all hosts have failed and the running result is not ok 16380 1727204189.37658: done checking to see if all hosts have failed 16380 1727204189.37659: getting the remaining hosts for this loop 16380 1727204189.37661: done getting the remaining hosts for this loop 16380 1727204189.37665: getting the next task for host managed-node2 16380 1727204189.37671: done getting next task for host managed-node2 16380 1727204189.37674: ^ task is: TASK: meta (flush_handlers) 16380 1727204189.37676: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204189.37681: getting variables 16380 1727204189.37682: in VariableManager get_vars() 16380 1727204189.37743: Calling all_inventory to load vars for managed-node2 16380 1727204189.37747: Calling groups_inventory to load vars for managed-node2 16380 1727204189.37751: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204189.37839: Calling all_plugins_play to load vars for managed-node2 16380 1727204189.37846: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204189.37851: Calling groups_plugins_play to load vars for managed-node2 16380 1727204189.42283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204189.47256: done with get_vars() 16380 1727204189.47371: done getting variables 16380 1727204189.47472: in VariableManager get_vars() 16380 1727204189.47485: Calling all_inventory to load vars for managed-node2 16380 1727204189.47488: Calling groups_inventory to load vars for managed-node2 16380 1727204189.47494: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204189.47507: Calling all_plugins_play to load vars for managed-node2 16380 1727204189.47510: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204189.47515: Calling groups_plugins_play to load vars for managed-node2 16380 1727204189.50277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204189.55406: done with get_vars() 16380 1727204189.55459: done queuing things up, now waiting for results queue to drain 16380 1727204189.55468: results queue empty 16380 1727204189.55469: checking for any_errors_fatal 16380 1727204189.55476: done checking for any_errors_fatal 16380 1727204189.55477: checking for max_fail_percentage 16380 1727204189.55478: done checking for max_fail_percentage 16380 1727204189.55479: checking to see if all hosts have failed and the running result is not ok 16380 1727204189.55480: done checking to see if all hosts have failed 16380 1727204189.55481: getting the remaining hosts for this loop 16380 1727204189.55488: done getting the remaining hosts for this loop 16380 1727204189.55494: getting the next task for host managed-node2 16380 1727204189.55504: done getting next task for host managed-node2 16380 1727204189.55508: ^ task is: TASK: Include the task '{{ task }}' 16380 1727204189.55510: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204189.55513: getting variables 16380 1727204189.55515: in VariableManager get_vars() 16380 1727204189.55529: Calling all_inventory to load vars for managed-node2 16380 1727204189.55531: Calling groups_inventory to load vars for managed-node2 16380 1727204189.55535: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204189.55542: Calling all_plugins_play to load vars for managed-node2 16380 1727204189.55546: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204189.55551: Calling groups_plugins_play to load vars for managed-node2 16380 1727204189.58328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204189.62085: done with get_vars() 16380 1727204189.62130: done getting variables 16380 1727204189.62459: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:56:29 -0400 (0:00:01.270) 0:00:50.731 ***** 16380 1727204189.62529: entering _queue_task() for managed-node2/include_tasks 16380 1727204189.63199: worker is 1 (out of 1 available) 16380 1727204189.63212: exiting _queue_task() for managed-node2/include_tasks 16380 1727204189.63224: done queuing things up, now waiting for results queue to drain 16380 1727204189.63226: waiting for pending results... 16380 1727204189.63392: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_absent.yml' 16380 1727204189.63514: in run() - task 12b410aa-8751-749c-b6eb-000000000073 16380 1727204189.63532: variable 'ansible_search_path' from source: unknown 16380 1727204189.63580: calling self._execute() 16380 1727204189.63695: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204189.63703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204189.63720: variable 'omit' from source: magic vars 16380 1727204189.64390: variable 'ansible_distribution_major_version' from source: facts 16380 1727204189.64400: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204189.64407: variable 'task' from source: play vars 16380 1727204189.64527: variable 'task' from source: play vars 16380 1727204189.64535: _execute() done 16380 1727204189.64562: dumping result to json 16380 1727204189.64565: done dumping result, returning 16380 1727204189.64594: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_absent.yml' [12b410aa-8751-749c-b6eb-000000000073] 16380 1727204189.64597: sending task result for task 12b410aa-8751-749c-b6eb-000000000073 16380 1727204189.64688: done sending task result for task 12b410aa-8751-749c-b6eb-000000000073 16380 1727204189.64693: WORKER PROCESS EXITING 16380 1727204189.64732: no more pending results, returning what we have 16380 1727204189.64738: in VariableManager get_vars() 16380 1727204189.64787: Calling all_inventory to load vars for managed-node2 16380 1727204189.64792: Calling groups_inventory to load vars for managed-node2 16380 1727204189.64797: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204189.64814: Calling all_plugins_play to load vars for managed-node2 16380 1727204189.64818: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204189.64823: Calling groups_plugins_play to load vars for 
managed-node2 16380 1727204189.68366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204189.72856: done with get_vars() 16380 1727204189.72904: variable 'ansible_search_path' from source: unknown 16380 1727204189.72933: we have included files to process 16380 1727204189.72935: generating all_blocks data 16380 1727204189.72936: done generating all_blocks data 16380 1727204189.72937: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 16380 1727204189.72939: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 16380 1727204189.72941: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 16380 1727204189.73237: in VariableManager get_vars() 16380 1727204189.73263: done with get_vars() 16380 1727204189.73436: done processing included file 16380 1727204189.73439: iterating over new_blocks loaded from include file 16380 1727204189.73440: in VariableManager get_vars() 16380 1727204189.73455: done with get_vars() 16380 1727204189.73456: filtering new block on tags 16380 1727204189.73479: done filtering new block on tags 16380 1727204189.73482: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 16380 1727204189.73531: extending task lists for all hosts with included blocks 16380 1727204189.73603: done extending task lists 16380 1727204189.73605: done processing included files 16380 1727204189.73606: results queue empty 16380 1727204189.73607: checking for any_errors_fatal 16380 1727204189.73609: done checking for any_errors_fatal 16380 1727204189.73610: checking for max_fail_percentage 16380 1727204189.73611: done checking for max_fail_percentage 16380 1727204189.73612: checking to see if all hosts have failed and the running result is not ok 16380 1727204189.73613: done checking to see if all hosts have failed 16380 1727204189.73614: getting the remaining hosts for this loop 16380 1727204189.73616: done getting the remaining hosts for this loop 16380 1727204189.73619: getting the next task for host managed-node2 16380 1727204189.73624: done getting next task for host managed-node2 16380 1727204189.73669: ^ task is: TASK: Include the task 'get_profile_stat.yml' 16380 1727204189.73673: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204189.73676: getting variables 16380 1727204189.73677: in VariableManager get_vars() 16380 1727204189.73688: Calling all_inventory to load vars for managed-node2 16380 1727204189.73693: Calling groups_inventory to load vars for managed-node2 16380 1727204189.73696: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204189.73703: Calling all_plugins_play to load vars for managed-node2 16380 1727204189.73707: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204189.73711: Calling groups_plugins_play to load vars for managed-node2 16380 1727204189.83000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204189.86371: done with get_vars() 16380 1727204189.86474: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.241) 0:00:50.973 ***** 16380 1727204189.86697: entering _queue_task() for managed-node2/include_tasks 16380 1727204189.87518: worker is 1 (out of 1 available) 16380 1727204189.87530: exiting _queue_task() for managed-node2/include_tasks 16380 1727204189.87542: done queuing things up, now waiting for results queue to drain 16380 1727204189.87544: waiting for pending results... 16380 1727204189.88064: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 16380 1727204189.88070: in run() - task 12b410aa-8751-749c-b6eb-00000000047f 16380 1727204189.88073: variable 'ansible_search_path' from source: unknown 16380 1727204189.88327: variable 'ansible_search_path' from source: unknown 16380 1727204189.88366: calling self._execute() 16380 1727204189.88886: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204189.88904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204189.88909: variable 'omit' from source: magic vars 16380 1727204189.90345: variable 'ansible_distribution_major_version' from source: facts 16380 1727204189.90360: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204189.90368: _execute() done 16380 1727204189.90371: dumping result to json 16380 1727204189.90377: done dumping result, returning 16380 1727204189.90385: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-749c-b6eb-00000000047f] 16380 1727204189.90393: sending task result for task 12b410aa-8751-749c-b6eb-00000000047f 16380 1727204189.90507: done sending task result for task 12b410aa-8751-749c-b6eb-00000000047f 16380 1727204189.90511: WORKER PROCESS EXITING 16380 1727204189.90586: no more pending results, returning what we have 16380 1727204189.90595: in VariableManager get_vars() 16380 1727204189.90637: Calling all_inventory to load vars for managed-node2 16380 1727204189.90641: Calling groups_inventory to load vars for managed-node2 16380 1727204189.90646: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204189.90670: Calling all_plugins_play to load vars for managed-node2 16380 1727204189.90674: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204189.90679: Calling groups_plugins_play to load vars for managed-node2 16380 1727204189.94768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 16380 1727204190.01932: done with get_vars() 16380 1727204190.02087: variable 'ansible_search_path' from source: unknown 16380 1727204190.02174: variable 'ansible_search_path' from source: unknown 16380 1727204190.02188: variable 'task' from source: play vars 16380 1727204190.02433: variable 'task' from source: play vars 16380 1727204190.02475: we have included files to process 16380 1727204190.02477: generating all_blocks data 16380 1727204190.02479: done generating all_blocks data 16380 1727204190.02481: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16380 1727204190.02482: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16380 1727204190.02486: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 16380 1727204190.05071: done processing included file 16380 1727204190.05073: iterating over new_blocks loaded from include file 16380 1727204190.05075: in VariableManager get_vars() 16380 1727204190.05300: done with get_vars() 16380 1727204190.05303: filtering new block on tags 16380 1727204190.05340: done filtering new block on tags 16380 1727204190.05344: in VariableManager get_vars() 16380 1727204190.05362: done with get_vars() 16380 1727204190.05364: filtering new block on tags 16380 1727204190.05398: done filtering new block on tags 16380 1727204190.05401: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 16380 1727204190.05407: extending task lists for all hosts with included blocks 16380 1727204190.05650: done extending task lists 16380 1727204190.05652: done processing included files 16380 1727204190.05653: results queue empty 16380 1727204190.05654: checking for any_errors_fatal 16380 1727204190.05659: done checking for any_errors_fatal 16380 1727204190.05660: checking for max_fail_percentage 16380 1727204190.05661: done checking for max_fail_percentage 16380 1727204190.05662: checking to see if all hosts have failed and the running result is not ok 16380 1727204190.05663: done checking to see if all hosts have failed 16380 1727204190.05664: getting the remaining hosts for this loop 16380 1727204190.05665: done getting the remaining hosts for this loop 16380 1727204190.05668: getting the next task for host managed-node2 16380 1727204190.05674: done getting next task for host managed-node2 16380 1727204190.05676: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 16380 1727204190.05680: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204190.05682: getting variables 16380 1727204190.05684: in VariableManager get_vars() 16380 1727204190.05798: Calling all_inventory to load vars for managed-node2 16380 1727204190.05802: Calling groups_inventory to load vars for managed-node2 16380 1727204190.05805: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204190.05812: Calling all_plugins_play to load vars for managed-node2 16380 1727204190.05815: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204190.05819: Calling groups_plugins_play to load vars for managed-node2 16380 1727204190.11316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204190.16046: done with get_vars() 16380 1727204190.16088: done getting variables 16380 1727204190.16150: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.294) 0:00:51.268 ***** 16380 1727204190.16198: entering _queue_task() for managed-node2/set_fact 16380 1727204190.16801: worker is 1 (out of 1 available) 16380 1727204190.16814: exiting _queue_task() for managed-node2/set_fact 16380 1727204190.16830: done queuing things up, now waiting for results queue to drain 16380 1727204190.16833: waiting for pending results... 
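The worker queued above is about to run the set_fact task from get_profile_stat.yml:3. The playbook source itself is not part of this log, but a minimal sketch consistent with the ansible_facts result recorded a few records below (all three flags come back false) would be the following; only the task name is confirmed by the banner, the rest is inferred:

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        # All three flags start false; later tasks in get_profile_stat.yml
        # would flip them if the profile file / NM connection is found.
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false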
16380 1727204190.17114: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 16380 1727204190.17122: in run() - task 12b410aa-8751-749c-b6eb-00000000048a 16380 1727204190.17147: variable 'ansible_search_path' from source: unknown 16380 1727204190.17157: variable 'ansible_search_path' from source: unknown 16380 1727204190.17214: calling self._execute() 16380 1727204190.17339: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.17352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.17368: variable 'omit' from source: magic vars 16380 1727204190.17865: variable 'ansible_distribution_major_version' from source: facts 16380 1727204190.17868: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204190.17871: variable 'omit' from source: magic vars 16380 1727204190.17939: variable 'omit' from source: magic vars 16380 1727204190.18001: variable 'omit' from source: magic vars 16380 1727204190.18054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204190.18194: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204190.18199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204190.18201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204190.18204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204190.18232: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204190.18243: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.18252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.18404: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204190.18432: Set connection var ansible_shell_executable to /bin/sh 16380 1727204190.18448: Set connection var ansible_connection to ssh 16380 1727204190.18462: Set connection var ansible_shell_type to sh 16380 1727204190.18474: Set connection var ansible_pipelining to False 16380 1727204190.18492: Set connection var ansible_timeout to 10 16380 1727204190.18536: variable 'ansible_shell_executable' from source: unknown 16380 1727204190.18544: variable 'ansible_connection' from source: unknown 16380 1727204190.18629: variable 'ansible_module_compression' from source: unknown 16380 1727204190.18637: variable 'ansible_shell_type' from source: unknown 16380 1727204190.18641: variable 'ansible_shell_executable' from source: unknown 16380 1727204190.18643: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.18646: variable 'ansible_pipelining' from source: unknown 16380 1727204190.18648: variable 'ansible_timeout' from source: unknown 16380 1727204190.18650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.18787: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204190.18809: variable 
'omit' from source: magic vars 16380 1727204190.18824: starting attempt loop 16380 1727204190.18832: running the handler 16380 1727204190.18860: handler run complete 16380 1727204190.18877: attempt loop complete, returning result 16380 1727204190.18884: _execute() done 16380 1727204190.18893: dumping result to json 16380 1727204190.18902: done dumping result, returning 16380 1727204190.18913: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-749c-b6eb-00000000048a] 16380 1727204190.18927: sending task result for task 12b410aa-8751-749c-b6eb-00000000048a 16380 1727204190.19140: done sending task result for task 12b410aa-8751-749c-b6eb-00000000048a 16380 1727204190.19143: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 16380 1727204190.19234: no more pending results, returning what we have 16380 1727204190.19238: results queue empty 16380 1727204190.19239: checking for any_errors_fatal 16380 1727204190.19242: done checking for any_errors_fatal 16380 1727204190.19242: checking for max_fail_percentage 16380 1727204190.19244: done checking for max_fail_percentage 16380 1727204190.19245: checking to see if all hosts have failed and the running result is not ok 16380 1727204190.19246: done checking to see if all hosts have failed 16380 1727204190.19247: getting the remaining hosts for this loop 16380 1727204190.19249: done getting the remaining hosts for this loop 16380 1727204190.19254: getting the next task for host managed-node2 16380 1727204190.19264: done getting next task for host managed-node2 16380 1727204190.19269: ^ task is: TASK: Stat profile file 16380 1727204190.19274: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204190.19280: getting variables 16380 1727204190.19283: in VariableManager get_vars() 16380 1727204190.19324: Calling all_inventory to load vars for managed-node2 16380 1727204190.19329: Calling groups_inventory to load vars for managed-node2 16380 1727204190.19334: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204190.19347: Calling all_plugins_play to load vars for managed-node2 16380 1727204190.19351: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204190.19354: Calling groups_plugins_play to load vars for managed-node2 16380 1727204190.21961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204190.26459: done with get_vars() 16380 1727204190.26629: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.106) 0:00:51.374 ***** 16380 1727204190.26822: entering _queue_task() for managed-node2/stat 16380 1727204190.27632: worker is 1 (out of 1 available) 16380 1727204190.27645: exiting _queue_task() for managed-node2/stat 16380 1727204190.27659: done queuing things up, now waiting for results queue to drain 16380 1727204190.27663: waiting for pending results... 16380 1727204190.28124: running TaskExecutor() for managed-node2/TASK: Stat profile file 16380 1727204190.28198: in run() - task 12b410aa-8751-749c-b6eb-00000000048b 16380 1727204190.28203: variable 'ansible_search_path' from source: unknown 16380 1727204190.28206: variable 'ansible_search_path' from source: unknown 16380 1727204190.28223: calling self._execute() 16380 1727204190.28339: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.28352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.28369: variable 'omit' from source: magic vars 16380 1727204190.28984: variable 'ansible_distribution_major_version' from source: facts 16380 1727204190.28987: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204190.28992: variable 'omit' from source: magic vars 16380 1727204190.28999: variable 'omit' from source: magic vars 16380 1727204190.29120: variable 'profile' from source: play vars 16380 1727204190.29125: variable 'interface' from source: set_fact 16380 1727204190.29211: variable 'interface' from source: set_fact 16380 1727204190.29237: variable 'omit' from source: magic vars 16380 1727204190.29373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204190.29377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204190.29380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204190.29391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204190.29405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204190.29444: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204190.29448: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.29453: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.29595: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204190.29801: Set connection var ansible_shell_executable to /bin/sh 16380 1727204190.29812: Set connection var ansible_connection to ssh 16380 1727204190.29816: Set connection var ansible_shell_type to sh 16380 1727204190.29835: Set connection var ansible_pipelining to False 16380 1727204190.29851: Set connection var ansible_timeout to 10 16380 1727204190.29880: variable 'ansible_shell_executable' from source: unknown 16380 1727204190.29884: variable 'ansible_connection' from source: unknown 16380 1727204190.29888: variable 'ansible_module_compression' from source: unknown 16380 1727204190.29892: variable 'ansible_shell_type' from source: unknown 16380 1727204190.29894: variable 'ansible_shell_executable' from source: unknown 16380 1727204190.30100: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.30104: variable 'ansible_pipelining' from source: unknown 16380 1727204190.30107: variable 'ansible_timeout' from source: unknown 16380 1727204190.30110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.30328: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204190.30341: variable 'omit' from source: magic vars 16380 1727204190.30428: starting attempt loop 16380 1727204190.30434: running the handler 16380 1727204190.30438: _low_level_execute_command(): starting 16380 1727204190.30440: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204190.31504: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204190.31607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204190.31612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204190.31815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204190.31988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204190.33834: stdout chunk (state=3): >>>/root <<< 16380 1727204190.33961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204190.33968: stdout chunk (state=3): >>><<< 16380 1727204190.33979: stderr chunk (state=3): >>><<< 16380 1727204190.34009: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204190.34094: _low_level_execute_command(): starting 16380 1727204190.34098: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420 `" && echo ansible-tmp-1727204190.340101-19624-46437060700420="` echo /root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420 `" ) && sleep 0' 16380 1727204190.35512: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204190.35522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204190.35698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204190.35731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204190.35867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204190.35947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204190.38063: stdout chunk (state=3): >>>ansible-tmp-1727204190.340101-19624-46437060700420=/root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420 <<< 16380 1727204190.38275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204190.38294: stdout chunk (state=3): >>><<< 16380 1727204190.38402: stderr chunk (state=3): >>><<< 16380 1727204190.38706: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204190.340101-19624-46437060700420=/root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204190.38710: variable 'ansible_module_compression' from source: unknown 16380 1727204190.38713: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16380 1727204190.38855: variable 'ansible_facts' from source: unknown 16380 1727204190.39242: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/AnsiballZ_stat.py 16380 1727204190.39693: Sending initial data 16380 1727204190.39804: Sent initial data (151 bytes) 16380 1727204190.40560: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204190.40752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204190.40884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204190.40925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204190.42683: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204190.42688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204190.42741: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp4dkrdz9l /root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/AnsiballZ_stat.py <<< 16380 1727204190.42745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/AnsiballZ_stat.py" <<< 16380 1727204190.42829: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 16380 1727204190.42845: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp4dkrdz9l" to remote "/root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/AnsiballZ_stat.py" <<< 16380 1727204190.45033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204190.45036: stdout chunk (state=3): >>><<< 16380 1727204190.45038: stderr chunk (state=3): >>><<< 16380 1727204190.45068: done transferring module to remote 16380 1727204190.45072: _low_level_execute_command(): starting 16380 1727204190.45202: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/ /root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/AnsiballZ_stat.py && sleep 0' 16380 1727204190.46543: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204190.46613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204190.46744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204190.55119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204190.55169: stderr chunk (state=3): >>><<< 16380 1727204190.55179: stdout chunk (state=3): >>><<< 16380 1727204190.55211: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204190.55339: _low_level_execute_command(): starting 16380 1727204190.55461: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/AnsiballZ_stat.py && sleep 0' 16380 1727204190.56265: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204190.56269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204190.56290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204190.56299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204190.56316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204190.56326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204190.56388: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204190.56453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204190.56503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204190.74241: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16380 1727204190.75718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204190.75770: stderr chunk (state=3): >>><<< 16380 1727204190.75773: stdout chunk (state=3): >>><<< 16380 1727204190.75792: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204190.75825: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204190.75835: _low_level_execute_command(): starting 16380 1727204190.75841: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204190.340101-19624-46437060700420/ > /dev/null 2>&1 && sleep 0' 16380 1727204190.76271: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204190.76286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204190.76294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204190.76320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204190.76324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204190.76326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204190.76379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204190.76383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204190.76427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204190.78475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204190.78501: stderr chunk (state=3): >>><<< 16380 1727204190.78521: stdout chunk (state=3): >>><<< 16380 1727204190.78696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204190.78700: handler run complete 16380 1727204190.78703: attempt loop complete, returning result 16380 1727204190.78709: _execute() done 16380 1727204190.78711: dumping result to json 16380 1727204190.78713: done dumping result, returning 16380 1727204190.78715: done running TaskExecutor() for managed-node2/TASK: Stat profile file [12b410aa-8751-749c-b6eb-00000000048b] 16380 1727204190.78717: sending task result for task 12b410aa-8751-749c-b6eb-00000000048b 16380 1727204190.78928: done sending task result for task 12b410aa-8751-749c-b6eb-00000000048b 16380 1727204190.78932: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 16380 1727204190.79021: no more pending results, returning what we have 16380 1727204190.79025: results queue empty 16380 1727204190.79026: checking for any_errors_fatal 16380 1727204190.79035: done checking for any_errors_fatal 16380 1727204190.79036: checking for max_fail_percentage 16380 1727204190.79038: done checking for max_fail_percentage 16380 1727204190.79039: checking to see if all hosts have failed and the running result is not ok 16380 1727204190.79040: done checking to see if all hosts have failed 16380 1727204190.79041: getting the remaining hosts for this loop 16380 1727204190.79043: done getting the remaining hosts for this loop 16380 1727204190.79120: 
getting the next task for host managed-node2 16380 1727204190.79128: done getting next task for host managed-node2 16380 1727204190.79131: ^ task is: TASK: Set NM profile exist flag based on the profile files 16380 1727204190.79135: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204190.79139: getting variables 16380 1727204190.79140: in VariableManager get_vars() 16380 1727204190.79171: Calling all_inventory to load vars for managed-node2 16380 1727204190.79175: Calling groups_inventory to load vars for managed-node2 16380 1727204190.79179: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204190.79195: Calling all_plugins_play to load vars for managed-node2 16380 1727204190.79199: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204190.79204: Calling groups_plugins_play to load vars for managed-node2 16380 1727204190.82242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204190.85005: done with get_vars() 16380 1727204190.85031: done getting variables 16380 1727204190.85095: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.583) 0:00:51.957 ***** 16380 1727204190.85134: entering _queue_task() for managed-node2/set_fact 16380 1727204190.85443: worker is 1 (out of 1 available) 16380 1727204190.85459: exiting _queue_task() for managed-node2/set_fact 16380 1727204190.85476: done queuing things up, now waiting for results queue to drain 16380 1727204190.85479: waiting for pending results... 
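The set_fact task queued here ends up skipped in the records that follow, with the log reporting false_condition: profile_stat.stat.exists (the stat result registered by the previous "Stat profile file" task). A plausible reconstruction, inferred only from the task name and that logged condition:

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists   # false in this run, so the task is skipped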
16380 1727204190.85676: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 16380 1727204190.85772: in run() - task 12b410aa-8751-749c-b6eb-00000000048c 16380 1727204190.85791: variable 'ansible_search_path' from source: unknown 16380 1727204190.85795: variable 'ansible_search_path' from source: unknown 16380 1727204190.85830: calling self._execute() 16380 1727204190.86015: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.86022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.86026: variable 'omit' from source: magic vars 16380 1727204190.86597: variable 'ansible_distribution_major_version' from source: facts 16380 1727204190.86626: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204190.86772: variable 'profile_stat' from source: set_fact 16380 1727204190.86787: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204190.86793: when evaluation is False, skipping this task 16380 1727204190.86904: _execute() done 16380 1727204190.86913: dumping result to json 16380 1727204190.86926: done dumping result, returning 16380 1727204190.86936: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-749c-b6eb-00000000048c] 16380 1727204190.86956: sending task result for task 12b410aa-8751-749c-b6eb-00000000048c skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204190.87237: no more pending results, returning what we have 16380 1727204190.87241: results queue empty 16380 1727204190.87243: checking for any_errors_fatal 16380 1727204190.87254: done checking for any_errors_fatal 16380 1727204190.87255: checking for max_fail_percentage 16380 1727204190.87257: done checking for max_fail_percentage 16380 1727204190.87257: checking to see if all hosts have failed and the running result is not ok 16380 1727204190.87258: done checking to see if all hosts have failed 16380 1727204190.87259: getting the remaining hosts for this loop 16380 1727204190.87261: done getting the remaining hosts for this loop 16380 1727204190.87265: getting the next task for host managed-node2 16380 1727204190.87278: done getting next task for host managed-node2 16380 1727204190.87281: ^ task is: TASK: Get NM profile info 16380 1727204190.87291: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204190.87297: getting variables 16380 1727204190.87299: in VariableManager get_vars() 16380 1727204190.87331: Calling all_inventory to load vars for managed-node2 16380 1727204190.87334: Calling groups_inventory to load vars for managed-node2 16380 1727204190.87338: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204190.87353: done sending task result for task 12b410aa-8751-749c-b6eb-00000000048c 16380 1727204190.87356: WORKER PROCESS EXITING 16380 1727204190.87366: Calling all_plugins_play to load vars for managed-node2 16380 1727204190.87370: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204190.87374: Calling groups_plugins_play to load vars for managed-node2 16380 1727204190.88640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204190.91338: done with get_vars() 16380 1727204190.91381: done getting variables 16380 1727204190.91517: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.064) 0:00:52.022 ***** 16380 1727204190.91575: entering _queue_task() for managed-node2/shell 16380 1727204190.91961: worker is 1 (out of 1 available) 16380 1727204190.91986: exiting _queue_task() for managed-node2/shell 16380 1727204190.92209: done queuing things up, now waiting for results queue to drain 16380 1727204190.92213: waiting for pending results... 
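The shell task queued here (get_profile_stat.yml:25) is the one whose full command and module arguments appear later in this trace (_raw_params, _uses_shell: true), and a later task checks a registered nm_profile_exists variable. A sketch reconstructed from those logged details, with the profile name templated back to the play var:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
      ignore_errors: true  # the rc=1 failure below is reported and then "...ignoring"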
16380 1727204190.92407: running TaskExecutor() for managed-node2/TASK: Get NM profile info 16380 1727204190.92576: in run() - task 12b410aa-8751-749c-b6eb-00000000048d 16380 1727204190.92731: variable 'ansible_search_path' from source: unknown 16380 1727204190.92736: variable 'ansible_search_path' from source: unknown 16380 1727204190.92740: calling self._execute() 16380 1727204190.92813: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.92816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.92824: variable 'omit' from source: magic vars 16380 1727204190.93626: variable 'ansible_distribution_major_version' from source: facts 16380 1727204190.93632: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204190.93635: variable 'omit' from source: magic vars 16380 1727204190.93638: variable 'omit' from source: magic vars 16380 1727204190.93640: variable 'profile' from source: play vars 16380 1727204190.93644: variable 'interface' from source: set_fact 16380 1727204190.93666: variable 'interface' from source: set_fact 16380 1727204190.93692: variable 'omit' from source: magic vars 16380 1727204190.93744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204190.93811: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204190.93835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204190.93848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204190.93930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204190.93935: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204190.93940: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.93943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.94060: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204190.94068: Set connection var ansible_shell_executable to /bin/sh 16380 1727204190.94260: Set connection var ansible_connection to ssh 16380 1727204190.94264: Set connection var ansible_shell_type to sh 16380 1727204190.94267: Set connection var ansible_pipelining to False 16380 1727204190.94273: Set connection var ansible_timeout to 10 16380 1727204190.94275: variable 'ansible_shell_executable' from source: unknown 16380 1727204190.94278: variable 'ansible_connection' from source: unknown 16380 1727204190.94281: variable 'ansible_module_compression' from source: unknown 16380 1727204190.94283: variable 'ansible_shell_type' from source: unknown 16380 1727204190.94286: variable 'ansible_shell_executable' from source: unknown 16380 1727204190.94291: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204190.94294: variable 'ansible_pipelining' from source: unknown 16380 1727204190.94296: variable 'ansible_timeout' from source: unknown 16380 1727204190.94299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204190.94573: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204190.94577: variable 'omit' from source: magic vars 16380 1727204190.94580: starting attempt loop 16380 1727204190.94583: running the handler 16380 1727204190.94587: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204190.94590: _low_level_execute_command(): starting 16380 1727204190.94592: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204190.95311: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204190.95315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204190.95320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204190.95354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204190.95632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204190.97191: stdout chunk (state=3): >>>/root <<< 16380 1727204190.97494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204190.97498: stderr chunk (state=3): >>><<< 16380 1727204190.97501: stdout chunk (state=3): >>><<< 16380 1727204190.97505: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204190.97507: _low_level_execute_command(): starting 16380 1727204190.97511: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537 `" && echo ansible-tmp-1727204190.9741116-19757-79644375322537="` echo /root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537 `" ) && sleep 0' 16380 1727204190.98123: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204190.98127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204190.98154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204190.98185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204190.98313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204190.98316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204190.98352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204190.98367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204190.98482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204191.00569: stdout chunk (state=3): >>>ansible-tmp-1727204190.9741116-19757-79644375322537=/root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537 <<< 16380 1727204191.00809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204191.00812: stdout chunk (state=3): >>><<< 16380 1727204191.00815: stderr chunk (state=3): >>><<< 16380 1727204191.00846: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204190.9741116-19757-79644375322537=/root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204191.01024: variable 'ansible_module_compression' from source: unknown 16380 1727204191.01027: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16380 1727204191.01069: variable 'ansible_facts' from source: unknown 16380 1727204191.01202: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/AnsiballZ_command.py 16380 1727204191.01385: Sending initial data 16380 1727204191.01400: Sent initial data (155 bytes) 16380 1727204191.01829: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204191.01835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204191.01865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204191.01868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204191.01872: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204191.01944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204191.01948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204191.02004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204191.03731: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204191.03774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204191.03827: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpuhefkad6 /root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/AnsiballZ_command.py <<< 16380 1727204191.03831: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/AnsiballZ_command.py" <<< 16380 1727204191.03862: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpuhefkad6" to remote "/root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/AnsiballZ_command.py" <<< 16380 1727204191.05326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204191.05428: stderr chunk (state=3): >>><<< 16380 1727204191.05433: stdout chunk (state=3): >>><<< 16380 1727204191.05435: done transferring module to remote 16380 1727204191.05437: _low_level_execute_command(): starting 16380 1727204191.05440: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/ /root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/AnsiballZ_command.py && sleep 0' 16380 1727204191.06114: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204191.06161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204191.06307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204191.06345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204191.06348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204191.06388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204191.06520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204191.08379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204191.08475: stderr chunk (state=3): >>><<< 16380 1727204191.08478: stdout chunk (state=3): >>><<< 16380 1727204191.08591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204191.08595: _low_level_execute_command(): starting 16380 1727204191.08598: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/AnsiballZ_command.py && sleep 0' 16380 1727204191.09183: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204191.09201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204191.09214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204191.09241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204191.09267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204191.09357: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204191.09395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204191.09412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204191.09433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204191.09514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204191.29099: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:31.271914", "end": "2024-09-24 14:56:31.290123", "delta": "0:00:00.018209", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16380 1727204191.31068: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204191.31072: stdout chunk (state=3): >>><<< 16380 1727204191.31075: stderr chunk (state=3): >>><<< 16380 1727204191.31078: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:31.271914", "end": "2024-09-24 14:56:31.290123", "delta": "0:00:00.018209", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
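Here rc=1 comes from grep: no LSR-TST-br31 profile file was found under /etc, which is exactly what an absent-profile test expects. To re-run the probe while debugging, a one-off task like the following sketch reproduces it (the task name and register are hypothetical; the command is copied from the log):

    - name: Re-run the logged nmcli probe (hypothetical debug task)
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep LSR-TST-br31 | grep /etc
      register: probe  # hypothetical name
      failed_when: false  # tolerate rc=1, mirroring the "...ignoring" handling in this run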
16380 1727204191.31081: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204191.31083: _low_level_execute_command(): starting 16380 1727204191.31085: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204190.9741116-19757-79644375322537/ > /dev/null 2>&1 && sleep 0' 16380 1727204191.31656: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204191.31667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204191.31679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204191.31698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204191.31730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204191.31807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204191.31848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204191.31897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204191.31906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204191.31947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204191.34198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204191.34201: stderr chunk (state=3): >>><<< 16380 1727204191.34204: stdout chunk (state=3): >>><<< 16380 1727204191.34206: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204191.34208: handler run complete 16380 1727204191.34210: Evaluated conditional (False): False 16380 1727204191.34211: attempt loop complete, returning result 16380 1727204191.34213: _execute() done 16380 1727204191.34215: dumping result to json 16380 1727204191.34216: done dumping result, returning 16380 1727204191.34220: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [12b410aa-8751-749c-b6eb-00000000048d] 16380 1727204191.34222: sending task result for task 12b410aa-8751-749c-b6eb-00000000048d 16380 1727204191.34296: done sending task result for task 12b410aa-8751-749c-b6eb-00000000048d 16380 1727204191.34299: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.018209", "end": "2024-09-24 14:56:31.290123", "rc": 1, "start": "2024-09-24 14:56:31.271914" } MSG: non-zero return code ...ignoring 16380 1727204191.34401: no more pending results, returning what we have 16380 1727204191.34405: results queue empty 16380 1727204191.34406: checking for any_errors_fatal 16380 1727204191.34417: done checking for any_errors_fatal 16380 1727204191.34418: checking for max_fail_percentage 16380 1727204191.34420: done checking for max_fail_percentage 16380 1727204191.34421: checking to see if all hosts have failed and the running result is not ok 16380 1727204191.34422: done checking to see if all hosts have failed 16380 1727204191.34423: getting the remaining hosts for this loop 16380 1727204191.34425: done getting the remaining hosts for this loop 16380 1727204191.34430: getting the next task for host managed-node2 16380 1727204191.34439: done getting next task for host managed-node2 16380 1727204191.34442: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16380 1727204191.34447: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204191.34452: getting variables 16380 1727204191.34454: in VariableManager get_vars() 16380 1727204191.34487: Calling all_inventory to load vars for managed-node2 16380 1727204191.34594: Calling groups_inventory to load vars for managed-node2 16380 1727204191.34599: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204191.34612: Calling all_plugins_play to load vars for managed-node2 16380 1727204191.34615: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204191.34619: Calling groups_plugins_play to load vars for managed-node2 16380 1727204191.36997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204191.40081: done with get_vars() 16380 1727204191.40128: done getting variables 16380 1727204191.40212: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:56:31 -0400 (0:00:00.486) 0:00:52.509 ***** 16380 1727204191.40253: entering _queue_task() for managed-node2/set_fact 16380 1727204191.40663: worker is 1 (out of 1 available) 16380 1727204191.40678: exiting _queue_task() for managed-node2/set_fact 16380 1727204191.40697: done queuing things up, now waiting for results queue to drain 16380 1727204191.40700: waiting for pending results... 
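The set_fact task queued above (get_profile_stat.yml:35) only fires when the nmcli probe succeeded; the trace below shows it skipped because nm_profile_exists.rc == 0 evaluates False (the probe returned rc=1). A sketch under that assumption, with hypothetical fact names since only the when-condition is visible in the log:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true             # hypothetical fact names;
        lsr_net_profile_ansible_managed: true    # the log shows only the condition, not the body
      when: nm_profile_exists.rc == 0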
16380 1727204191.41030: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 16380 1727204191.41187: in run() - task 12b410aa-8751-749c-b6eb-00000000048e 16380 1727204191.41500: variable 'ansible_search_path' from source: unknown 16380 1727204191.41504: variable 'ansible_search_path' from source: unknown 16380 1727204191.41508: calling self._execute() 16380 1727204191.41585: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204191.41799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204191.41803: variable 'omit' from source: magic vars 16380 1727204191.42629: variable 'ansible_distribution_major_version' from source: facts 16380 1727204191.42716: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204191.43073: variable 'nm_profile_exists' from source: set_fact 16380 1727204191.43158: Evaluated conditional (nm_profile_exists.rc == 0): False 16380 1727204191.43203: when evaluation is False, skipping this task 16380 1727204191.43212: _execute() done 16380 1727204191.43226: dumping result to json 16380 1727204191.43253: done dumping result, returning 16380 1727204191.43268: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-749c-b6eb-00000000048e] 16380 1727204191.43316: sending task result for task 12b410aa-8751-749c-b6eb-00000000048e skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 16380 1727204191.43578: no more pending results, returning what we have 16380 1727204191.43583: results queue empty 16380 1727204191.43584: checking for any_errors_fatal 16380 1727204191.43601: done checking for any_errors_fatal 16380 1727204191.43602: checking for max_fail_percentage 16380 1727204191.43604: done checking for max_fail_percentage 16380 1727204191.43605: checking to see if all hosts have failed and the running result is not ok 16380 1727204191.43606: done checking to see if all hosts have failed 16380 1727204191.43607: getting the remaining hosts for this loop 16380 1727204191.43609: done getting the remaining hosts for this loop 16380 1727204191.43616: getting the next task for host managed-node2 16380 1727204191.43631: done getting next task for host managed-node2 16380 1727204191.43636: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 16380 1727204191.43641: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204191.43646: getting variables 16380 1727204191.43648: in VariableManager get_vars() 16380 1727204191.43683: Calling all_inventory to load vars for managed-node2 16380 1727204191.43687: Calling groups_inventory to load vars for managed-node2 16380 1727204191.43999: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204191.44016: Calling all_plugins_play to load vars for managed-node2 16380 1727204191.44024: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204191.44028: Calling groups_plugins_play to load vars for managed-node2 16380 1727204191.44698: done sending task result for task 12b410aa-8751-749c-b6eb-00000000048e 16380 1727204191.44702: WORKER PROCESS EXITING 16380 1727204191.48167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204191.53425: done with get_vars() 16380 1727204191.53471: done getting variables 16380 1727204191.53550: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204191.53893: variable 'profile' from source: play vars 16380 1727204191.53897: variable 'interface' from source: set_fact 16380 1727204191.53979: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:56:31 -0400 (0:00:00.137) 0:00:52.646 ***** 16380 1727204191.54017: entering _queue_task() for managed-node2/command 16380 1727204191.54797: worker is 1 (out of 1 available) 16380 1727204191.54812: exiting _queue_task() for managed-node2/command 16380 1727204191.54830: done queuing things up, now waiting for results queue to drain 16380 1727204191.54833: waiting for pending results... 
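This command task (and the three that follow at get_profile_stat.yml:56, :62, and :69) is gated on profile_stat.stat.exists, so all four are skipped in this run. A sketch of the :49 task; the grep pattern, the ifcfg path under /etc/sysconfig/network-scripts, and the register name are assumptions, since only the task name, module, and condition appear in the log:

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # pattern and path are assumptions
      register: ansible_managed_comment  # hypothetical register name
      when: profile_stat.stat.exists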
16380 1727204191.55374: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 16380 1727204191.55680: in run() - task 12b410aa-8751-749c-b6eb-000000000490 16380 1727204191.55699: variable 'ansible_search_path' from source: unknown 16380 1727204191.55703: variable 'ansible_search_path' from source: unknown 16380 1727204191.55744: calling self._execute() 16380 1727204191.55961: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204191.56082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204191.56096: variable 'omit' from source: magic vars 16380 1727204191.57003: variable 'ansible_distribution_major_version' from source: facts 16380 1727204191.57016: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204191.57411: variable 'profile_stat' from source: set_fact 16380 1727204191.57431: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204191.57435: when evaluation is False, skipping this task 16380 1727204191.57438: _execute() done 16380 1727204191.57443: dumping result to json 16380 1727204191.57448: done dumping result, returning 16380 1727204191.57456: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000490] 16380 1727204191.57462: sending task result for task 12b410aa-8751-749c-b6eb-000000000490 16380 1727204191.57685: done sending task result for task 12b410aa-8751-749c-b6eb-000000000490 16380 1727204191.57688: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204191.57854: no more pending results, returning what we have 16380 1727204191.57857: results queue empty 16380 1727204191.57858: checking for any_errors_fatal 16380 1727204191.57870: done checking for any_errors_fatal 16380 1727204191.57871: checking for max_fail_percentage 16380 1727204191.57872: done checking for max_fail_percentage 16380 1727204191.57873: checking to see if all hosts have failed and the running result is not ok 16380 1727204191.57874: done checking to see if all hosts have failed 16380 1727204191.57875: getting the remaining hosts for this loop 16380 1727204191.57877: done getting the remaining hosts for this loop 16380 1727204191.57883: getting the next task for host managed-node2 16380 1727204191.57894: done getting next task for host managed-node2 16380 1727204191.57897: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 16380 1727204191.57901: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204191.57907: getting variables 16380 1727204191.57909: in VariableManager get_vars() 16380 1727204191.57945: Calling all_inventory to load vars for managed-node2 16380 1727204191.57950: Calling groups_inventory to load vars for managed-node2 16380 1727204191.57955: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204191.57970: Calling all_plugins_play to load vars for managed-node2 16380 1727204191.57974: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204191.57979: Calling groups_plugins_play to load vars for managed-node2 16380 1727204191.61884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204191.66528: done with get_vars() 16380 1727204191.66566: done getting variables 16380 1727204191.66849: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204191.67196: variable 'profile' from source: play vars 16380 1727204191.67200: variable 'interface' from source: set_fact 16380 1727204191.67281: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:56:31 -0400 (0:00:00.133) 0:00:52.779 ***** 16380 1727204191.67324: entering _queue_task() for managed-node2/set_fact 16380 1727204191.68317: worker is 1 (out of 1 available) 16380 1727204191.68334: exiting _queue_task() for managed-node2/set_fact 16380 1727204191.68351: done queuing things up, now waiting for results queue to drain 16380 1727204191.68353: waiting for pending results... 
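Note that this "Verify" step is implemented with set_fact rather than assert (the trace loads the set_fact action plugin for it). A plausible sketch, with the fact name and the check expression as hypothetical placeholders:

    - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_ansible_managed: "{{ ansible_managed_comment.stdout is search('ansible_managed') }}"  # hypothetical body
      when: profile_stat.stat.exists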
16380 1727204191.68800: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 16380 1727204191.69129: in run() - task 12b410aa-8751-749c-b6eb-000000000491 16380 1727204191.69207: variable 'ansible_search_path' from source: unknown 16380 1727204191.69213: variable 'ansible_search_path' from source: unknown 16380 1727204191.69366: calling self._execute() 16380 1727204191.69588: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204191.69598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204191.69611: variable 'omit' from source: magic vars 16380 1727204191.70445: variable 'ansible_distribution_major_version' from source: facts 16380 1727204191.70458: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204191.70725: variable 'profile_stat' from source: set_fact 16380 1727204191.70743: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204191.70747: when evaluation is False, skipping this task 16380 1727204191.70750: _execute() done 16380 1727204191.70872: dumping result to json 16380 1727204191.70876: done dumping result, returning 16380 1727204191.70885: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000491] 16380 1727204191.70892: sending task result for task 12b410aa-8751-749c-b6eb-000000000491 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204191.71143: no more pending results, returning what we have 16380 1727204191.71147: results queue empty 16380 1727204191.71148: checking for any_errors_fatal 16380 1727204191.71156: done checking for any_errors_fatal 16380 1727204191.71157: checking for max_fail_percentage 16380 1727204191.71159: done checking for max_fail_percentage 16380 1727204191.71159: checking to see if all hosts have failed and the running result is not ok 16380 1727204191.71160: done checking to see if all hosts have failed 16380 1727204191.71161: getting the remaining hosts for this loop 16380 1727204191.71164: done getting the remaining hosts for this loop 16380 1727204191.71168: getting the next task for host managed-node2 16380 1727204191.71177: done getting next task for host managed-node2 16380 1727204191.71180: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 16380 1727204191.71185: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204191.71192: getting variables 16380 1727204191.71194: in VariableManager get_vars() 16380 1727204191.71326: Calling all_inventory to load vars for managed-node2 16380 1727204191.71330: Calling groups_inventory to load vars for managed-node2 16380 1727204191.71334: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204191.71349: Calling all_plugins_play to load vars for managed-node2 16380 1727204191.71352: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204191.71356: Calling groups_plugins_play to load vars for managed-node2 16380 1727204191.72331: done sending task result for task 12b410aa-8751-749c-b6eb-000000000491 16380 1727204191.72335: WORKER PROCESS EXITING 16380 1727204191.74679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204191.77780: done with get_vars() 16380 1727204191.77821: done getting variables 16380 1727204191.78106: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204191.78236: variable 'profile' from source: play vars 16380 1727204191.78240: variable 'interface' from source: set_fact 16380 1727204191.78513: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:56:31 -0400 (0:00:00.112) 0:00:52.892 ***** 16380 1727204191.78552: entering _queue_task() for managed-node2/command 16380 1727204191.79343: worker is 1 (out of 1 available) 16380 1727204191.79359: exiting _queue_task() for managed-node2/command 16380 1727204191.79373: done queuing things up, now waiting for results queue to drain 16380 1727204191.79375: waiting for pending results... 
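The fingerprint pair at get_profile_stat.yml:62 and :69 mirrors the ansible_managed pair above: a command task grepping the profile file, then a set_fact "verify" step, both gated on profile_stat.stat.exists and both skipped here. A compact sketch of the :62 task under the same assumptions:

    - name: Get the fingerprint comment in ifcfg-{{ profile }}
      command: grep fingerprint /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # pattern and path are assumptions
      register: fingerprint_comment  # hypothetical register name
      when: profile_stat.stat.exists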
16380 1727204191.79777: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 16380 1727204191.80111: in run() - task 12b410aa-8751-749c-b6eb-000000000492 16380 1727204191.80129: variable 'ansible_search_path' from source: unknown 16380 1727204191.80133: variable 'ansible_search_path' from source: unknown 16380 1727204191.80174: calling self._execute() 16380 1727204191.80287: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204191.80499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204191.80513: variable 'omit' from source: magic vars 16380 1727204191.81323: variable 'ansible_distribution_major_version' from source: facts 16380 1727204191.81336: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204191.81494: variable 'profile_stat' from source: set_fact 16380 1727204191.81714: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204191.81717: when evaluation is False, skipping this task 16380 1727204191.81723: _execute() done 16380 1727204191.81726: dumping result to json 16380 1727204191.81728: done dumping result, returning 16380 1727204191.81740: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000492] 16380 1727204191.81746: sending task result for task 12b410aa-8751-749c-b6eb-000000000492 16380 1727204191.81891: done sending task result for task 12b410aa-8751-749c-b6eb-000000000492 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204191.81956: no more pending results, returning what we have 16380 1727204191.81961: results queue empty 16380 1727204191.81962: checking for any_errors_fatal 16380 1727204191.81971: done checking for any_errors_fatal 16380 1727204191.81973: checking for max_fail_percentage 16380 1727204191.81975: done checking for max_fail_percentage 16380 1727204191.81975: checking to see if all hosts have failed and the running result is not ok 16380 1727204191.81976: done checking to see if all hosts have failed 16380 1727204191.81978: getting the remaining hosts for this loop 16380 1727204191.81980: done getting the remaining hosts for this loop 16380 1727204191.81985: getting the next task for host managed-node2 16380 1727204191.81995: done getting next task for host managed-node2 16380 1727204191.81999: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 16380 1727204191.82004: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204191.82009: getting variables 16380 1727204191.82011: in VariableManager get_vars() 16380 1727204191.82048: Calling all_inventory to load vars for managed-node2 16380 1727204191.82052: Calling groups_inventory to load vars for managed-node2 16380 1727204191.82057: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204191.82073: Calling all_plugins_play to load vars for managed-node2 16380 1727204191.82077: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204191.82080: Calling groups_plugins_play to load vars for managed-node2 16380 1727204191.82094: WORKER PROCESS EXITING 16380 1727204191.87115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204191.93051: done with get_vars() 16380 1727204191.93099: done getting variables 16380 1727204191.93173: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204191.93613: variable 'profile' from source: play vars 16380 1727204191.93620: variable 'interface' from source: set_fact 16380 1727204191.93894: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:56:31 -0400 (0:00:00.153) 0:00:53.045 ***** 16380 1727204191.93933: entering _queue_task() for managed-node2/set_fact 16380 1727204191.94728: worker is 1 (out of 1 available) 16380 1727204191.94740: exiting _queue_task() for managed-node2/set_fact 16380 1727204191.94752: done queuing things up, now waiting for results queue to drain 16380 1727204191.94755: waiting for pending results... 
16380 1727204191.95110: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 16380 1727204191.95440: in run() - task 12b410aa-8751-749c-b6eb-000000000493 16380 1727204191.95458: variable 'ansible_search_path' from source: unknown 16380 1727204191.95463: variable 'ansible_search_path' from source: unknown 16380 1727204191.95504: calling self._execute() 16380 1727204191.95809: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204191.95817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204191.95852: variable 'omit' from source: magic vars 16380 1727204191.96765: variable 'ansible_distribution_major_version' from source: facts 16380 1727204191.96769: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204191.96791: variable 'profile_stat' from source: set_fact 16380 1727204191.97084: Evaluated conditional (profile_stat.stat.exists): False 16380 1727204191.97088: when evaluation is False, skipping this task 16380 1727204191.97092: _execute() done 16380 1727204191.97179: dumping result to json 16380 1727204191.97183: done dumping result, returning 16380 1727204191.97193: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [12b410aa-8751-749c-b6eb-000000000493] 16380 1727204191.97199: sending task result for task 12b410aa-8751-749c-b6eb-000000000493 16380 1727204191.97304: done sending task result for task 12b410aa-8751-749c-b6eb-000000000493 16380 1727204191.97308: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 16380 1727204191.97374: no more pending results, returning what we have 16380 1727204191.97378: results queue empty 16380 1727204191.97380: checking for any_errors_fatal 16380 1727204191.97393: done checking for any_errors_fatal 16380 1727204191.97394: checking for max_fail_percentage 16380 1727204191.97395: done checking for max_fail_percentage 16380 1727204191.97396: checking to see if all hosts have failed and the running result is not ok 16380 1727204191.97397: done checking to see if all hosts have failed 16380 1727204191.97398: getting the remaining hosts for this loop 16380 1727204191.97400: done getting the remaining hosts for this loop 16380 1727204191.97404: getting the next task for host managed-node2 16380 1727204191.97414: done getting next task for host managed-node2 16380 1727204191.97420: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 16380 1727204191.97423: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204191.97429: getting variables 16380 1727204191.97431: in VariableManager get_vars() 16380 1727204191.97466: Calling all_inventory to load vars for managed-node2 16380 1727204191.97468: Calling groups_inventory to load vars for managed-node2 16380 1727204191.97473: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204191.97488: Calling all_plugins_play to load vars for managed-node2 16380 1727204191.97533: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204191.97539: Calling groups_plugins_play to load vars for managed-node2 16380 1727204192.03056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204192.09477: done with get_vars() 16380 1727204192.09516: done getting variables 16380 1727204192.09588: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204192.09927: variable 'profile' from source: play vars 16380 1727204192.09932: variable 'interface' from source: set_fact 16380 1727204192.10205: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'LSR-TST-br31'] ********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:56:32 -0400 (0:00:00.163) 0:00:53.209 ***** 16380 1727204192.10241: entering _queue_task() for managed-node2/assert 16380 1727204192.11040: worker is 1 (out of 1 available) 16380 1727204192.11054: exiting _queue_task() for managed-node2/assert 16380 1727204192.11069: done queuing things up, now waiting for results queue to drain 16380 1727204192.11071: waiting for pending results... 
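The assert queued here (assert_profile_absent.yml:5) verifies the lsr_net_profile_exists fact recorded by the earlier profile check. Judging by the conditional evaluated in the trace below, it is roughly equivalent to the following sketch; the that-list is taken from the logged condition, and any failure message would be the collection's own:

    - name: Assert that the profile is absent - '{{ profile }}'
      assert:
        that:
          - not lsr_net_profile_exists

With the profile already removed, the assertion passes and the task reports "All assertions passed".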
16380 1727204192.11511: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'LSR-TST-br31' 16380 1727204192.11738: in run() - task 12b410aa-8751-749c-b6eb-000000000480 16380 1727204192.11743: variable 'ansible_search_path' from source: unknown 16380 1727204192.11746: variable 'ansible_search_path' from source: unknown 16380 1727204192.11780: calling self._execute() 16380 1727204192.11956: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204192.12100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204192.12113: variable 'omit' from source: magic vars 16380 1727204192.12843: variable 'ansible_distribution_major_version' from source: facts 16380 1727204192.12847: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204192.12855: variable 'omit' from source: magic vars 16380 1727204192.13010: variable 'omit' from source: magic vars 16380 1727204192.13341: variable 'profile' from source: play vars 16380 1727204192.13345: variable 'interface' from source: set_fact 16380 1727204192.13430: variable 'interface' from source: set_fact 16380 1727204192.13443: variable 'omit' from source: magic vars 16380 1727204192.13696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204192.13742: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204192.13767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204192.13789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204192.13807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204192.13843: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204192.13847: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204192.13850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204192.14182: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204192.14261: Set connection var ansible_shell_executable to /bin/sh 16380 1727204192.14265: Set connection var ansible_connection to ssh 16380 1727204192.14268: Set connection var ansible_shell_type to sh 16380 1727204192.14270: Set connection var ansible_pipelining to False 16380 1727204192.14273: Set connection var ansible_timeout to 10 16380 1727204192.14275: variable 'ansible_shell_executable' from source: unknown 16380 1727204192.14278: variable 'ansible_connection' from source: unknown 16380 1727204192.14280: variable 'ansible_module_compression' from source: unknown 16380 1727204192.14282: variable 'ansible_shell_type' from source: unknown 16380 1727204192.14284: variable 'ansible_shell_executable' from source: unknown 16380 1727204192.14286: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204192.14288: variable 'ansible_pipelining' from source: unknown 16380 1727204192.14293: variable 'ansible_timeout' from source: unknown 16380 1727204192.14295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204192.14700: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204192.14703: variable 'omit' from source: magic vars 16380 1727204192.14706: starting attempt loop 16380 1727204192.14708: running the handler 16380 1727204192.14916: variable 'lsr_net_profile_exists' from source: set_fact 16380 1727204192.14923: Evaluated conditional (not lsr_net_profile_exists): True 16380 1727204192.14994: handler run complete 16380 1727204192.14998: attempt loop complete, returning result 16380 1727204192.15001: _execute() done 16380 1727204192.15003: dumping result to json 16380 1727204192.15006: done dumping result, returning 16380 1727204192.15008: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'LSR-TST-br31' [12b410aa-8751-749c-b6eb-000000000480] 16380 1727204192.15010: sending task result for task 12b410aa-8751-749c-b6eb-000000000480 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 16380 1727204192.15135: no more pending results, returning what we have 16380 1727204192.15140: results queue empty 16380 1727204192.15141: checking for any_errors_fatal 16380 1727204192.15149: done checking for any_errors_fatal 16380 1727204192.15150: checking for max_fail_percentage 16380 1727204192.15152: done checking for max_fail_percentage 16380 1727204192.15153: checking to see if all hosts have failed and the running result is not ok 16380 1727204192.15154: done checking to see if all hosts have failed 16380 1727204192.15155: getting the remaining hosts for this loop 16380 1727204192.15157: done getting the remaining hosts for this loop 16380 1727204192.15162: getting the next task for host managed-node2 16380 1727204192.15173: done getting next task for host managed-node2 16380 1727204192.15176: ^ task is: TASK: meta (flush_handlers) 16380 1727204192.15179: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204192.15185: getting variables 16380 1727204192.15187: in VariableManager get_vars() 16380 1727204192.15225: Calling all_inventory to load vars for managed-node2 16380 1727204192.15229: Calling groups_inventory to load vars for managed-node2 16380 1727204192.15233: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204192.15246: Calling all_plugins_play to load vars for managed-node2 16380 1727204192.15250: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204192.15253: Calling groups_plugins_play to load vars for managed-node2 16380 1727204192.16496: done sending task result for task 12b410aa-8751-749c-b6eb-000000000480 16380 1727204192.16499: WORKER PROCESS EXITING 16380 1727204192.19643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204192.25679: done with get_vars() 16380 1727204192.25941: done getting variables 16380 1727204192.26246: in VariableManager get_vars() 16380 1727204192.26260: Calling all_inventory to load vars for managed-node2 16380 1727204192.26263: Calling groups_inventory to load vars for managed-node2 16380 1727204192.26267: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204192.26273: Calling all_plugins_play to load vars for managed-node2 16380 1727204192.26277: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204192.26281: Calling groups_plugins_play to load vars for managed-node2 16380 1727204192.28701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204192.32665: done with get_vars() 16380 1727204192.32730: done queuing things up, now waiting for results queue to drain 16380 1727204192.32733: results queue empty 16380 1727204192.32734: checking for any_errors_fatal 16380 1727204192.32738: done checking for any_errors_fatal 16380 1727204192.32739: checking for max_fail_percentage 16380 1727204192.32740: done checking for max_fail_percentage 16380 1727204192.32741: checking to see if all hosts have failed and the running result is not ok 16380 1727204192.32754: done checking to see if all hosts have failed 16380 1727204192.32755: getting the remaining hosts for this loop 16380 1727204192.32757: done getting the remaining hosts for this loop 16380 1727204192.32761: getting the next task for host managed-node2 16380 1727204192.32767: done getting next task for host managed-node2 16380 1727204192.32769: ^ task is: TASK: meta (flush_handlers) 16380 1727204192.32771: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204192.32774: getting variables 16380 1727204192.32775: in VariableManager get_vars() 16380 1727204192.32788: Calling all_inventory to load vars for managed-node2 16380 1727204192.32794: Calling groups_inventory to load vars for managed-node2 16380 1727204192.32797: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204192.32804: Calling all_plugins_play to load vars for managed-node2 16380 1727204192.32807: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204192.32810: Calling groups_plugins_play to load vars for managed-node2 16380 1727204192.35479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204192.38991: done with get_vars() 16380 1727204192.39029: done getting variables 16380 1727204192.39097: in VariableManager get_vars() 16380 1727204192.39110: Calling all_inventory to load vars for managed-node2 16380 1727204192.39113: Calling groups_inventory to load vars for managed-node2 16380 1727204192.39117: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204192.39126: Calling all_plugins_play to load vars for managed-node2 16380 1727204192.39129: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204192.39133: Calling groups_plugins_play to load vars for managed-node2 16380 1727204192.41222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204192.45410: done with get_vars() 16380 1727204192.45459: done queuing things up, now waiting for results queue to drain 16380 1727204192.45462: results queue empty 16380 1727204192.45464: checking for any_errors_fatal 16380 1727204192.45465: done checking for any_errors_fatal 16380 1727204192.45466: checking for max_fail_percentage 16380 1727204192.45468: done checking for max_fail_percentage 16380 1727204192.45469: checking to see if all hosts have failed and the running result is not ok 16380 1727204192.45470: done checking to see if all hosts have failed 16380 1727204192.45471: getting the remaining hosts for this loop 16380 1727204192.45472: done getting the remaining hosts for this loop 16380 1727204192.45475: getting the next task for host managed-node2 16380 1727204192.45480: done getting next task for host managed-node2 16380 1727204192.45481: ^ task is: None 16380 1727204192.45483: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204192.45484: done queuing things up, now waiting for results queue to drain 16380 1727204192.45485: results queue empty 16380 1727204192.45486: checking for any_errors_fatal 16380 1727204192.45487: done checking for any_errors_fatal 16380 1727204192.45488: checking for max_fail_percentage 16380 1727204192.45492: done checking for max_fail_percentage 16380 1727204192.45493: checking to see if all hosts have failed and the running result is not ok 16380 1727204192.45494: done checking to see if all hosts have failed 16380 1727204192.45495: getting the next task for host managed-node2 16380 1727204192.45498: done getting next task for host managed-node2 16380 1727204192.45499: ^ task is: None 16380 1727204192.45501: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204192.45554: in VariableManager get_vars() 16380 1727204192.45577: done with get_vars() 16380 1727204192.45585: in VariableManager get_vars() 16380 1727204192.45600: done with get_vars() 16380 1727204192.45606: variable 'omit' from source: magic vars 16380 1727204192.45742: variable 'task' from source: play vars 16380 1727204192.45781: in VariableManager get_vars() 16380 1727204192.45798: done with get_vars() 16380 1727204192.45827: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 16380 1727204192.46316: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204192.46342: getting the remaining hosts for this loop 16380 1727204192.46344: done getting the remaining hosts for this loop 16380 1727204192.46347: getting the next task for host managed-node2 16380 1727204192.46351: done getting next task for host managed-node2 16380 1727204192.46354: ^ task is: TASK: Gathering Facts 16380 1727204192.46356: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204192.46358: getting variables 16380 1727204192.46373: in VariableManager get_vars() 16380 1727204192.46384: Calling all_inventory to load vars for managed-node2 16380 1727204192.46387: Calling groups_inventory to load vars for managed-node2 16380 1727204192.46392: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204192.46399: Calling all_plugins_play to load vars for managed-node2 16380 1727204192.46402: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204192.46407: Calling groups_plugins_play to load vars for managed-node2 16380 1727204192.58973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204192.63909: done with get_vars() 16380 1727204192.63955: done getting variables 16380 1727204192.64052: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:56:32 -0400 (0:00:00.538) 0:00:53.747 ***** 16380 1727204192.64082: entering _queue_task() for managed-node2/gather_facts 16380 1727204192.64684: worker is 1 (out of 1 available) 16380 1727204192.64697: exiting _queue_task() for managed-node2/gather_facts 16380 1727204192.64710: done queuing things up, now waiting for results queue to drain 16380 1727204192.64712: waiting for pending results... 
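The new play opens with fact gathering, and the trace that follows shows Ansible's standard remote-execution flow end to end: resolve connection variables, reuse the multiplexed SSH master, create a remote temp directory, ship the AnsiballZ-packed setup module over SFTP, run it with the remote /usr/bin/python3.12, parse the JSON facts it prints, and delete the temp directory. In playbook terms, the play driving this is presumably shaped like the sketch below; the hosts pattern and task name are assumptions, while the 'task' variable comes from play vars, as the trace above notes:

    - hosts: all
      gather_facts: true                  # triggers the ansible.legacy.setup run traced below
      tasks:
        - name: Run the tasklist
          include_tasks: "{{ task }}"     # here task == tasks/assert_device_absent.yml, per the PLAY banner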
16380 1727204192.64952: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204192.64982: in run() - task 12b410aa-8751-749c-b6eb-0000000004c5 16380 1727204192.65012: variable 'ansible_search_path' from source: unknown 16380 1727204192.65069: calling self._execute() 16380 1727204192.65195: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204192.65212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204192.65234: variable 'omit' from source: magic vars 16380 1727204192.66070: variable 'ansible_distribution_major_version' from source: facts 16380 1727204192.66088: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204192.66176: variable 'omit' from source: magic vars 16380 1727204192.66231: variable 'omit' from source: magic vars 16380 1727204192.66293: variable 'omit' from source: magic vars 16380 1727204192.66348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204192.66407: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204192.66473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204192.66476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204192.66482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204192.66531: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204192.66542: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204192.66553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204192.66713: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204192.66803: Set connection var ansible_shell_executable to /bin/sh 16380 1727204192.66806: Set connection var ansible_connection to ssh 16380 1727204192.66809: Set connection var ansible_shell_type to sh 16380 1727204192.66812: Set connection var ansible_pipelining to False 16380 1727204192.66815: Set connection var ansible_timeout to 10 16380 1727204192.66832: variable 'ansible_shell_executable' from source: unknown 16380 1727204192.66842: variable 'ansible_connection' from source: unknown 16380 1727204192.66851: variable 'ansible_module_compression' from source: unknown 16380 1727204192.66860: variable 'ansible_shell_type' from source: unknown 16380 1727204192.66868: variable 'ansible_shell_executable' from source: unknown 16380 1727204192.66877: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204192.66888: variable 'ansible_pipelining' from source: unknown 16380 1727204192.66903: variable 'ansible_timeout' from source: unknown 16380 1727204192.66917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204192.67160: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204192.67235: variable 'omit' from source: magic vars 16380 1727204192.67239: starting attempt loop 16380 1727204192.67242: running the 
handler 16380 1727204192.67245: variable 'ansible_facts' from source: unknown 16380 1727204192.67247: _low_level_execute_command(): starting 16380 1727204192.67258: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204192.68377: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204192.68537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204192.68544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204192.68593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204192.68728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204192.70526: stdout chunk (state=3): >>>/root <<< 16380 1727204192.70755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204192.70759: stdout chunk (state=3): >>><<< 16380 1727204192.70762: stderr chunk (state=3): >>><<< 16380 1727204192.70825: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204192.70845: _low_level_execute_command(): starting 16380 1727204192.70941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431 `" && echo ansible-tmp-1727204192.7083166-19861-46837191645431="` echo /root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431 `" ) && sleep 0' 16380 1727204192.72431: 
stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204192.72462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204192.72531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204192.72570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204192.72593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204192.72650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204192.74794: stdout chunk (state=3): >>>ansible-tmp-1727204192.7083166-19861-46837191645431=/root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431 <<< 16380 1727204192.74962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204192.74965: stdout chunk (state=3): >>><<< 16380 1727204192.74969: stderr chunk (state=3): >>><<< 16380 1727204192.75022: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204192.7083166-19861-46837191645431=/root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204192.75042: variable 'ansible_module_compression' from source: unknown 16380 1727204192.75231: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204192.75235: variable 'ansible_facts' from source: unknown 16380 1727204192.75292: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/AnsiballZ_setup.py 16380 1727204192.75425: Sending initial data 16380 1727204192.75429: Sent initial data (153 bytes) 16380 1727204192.75948: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204192.76016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204192.76159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204192.77933: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204192.77985: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204192.78039: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmptf9b_wdb /root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/AnsiballZ_setup.py <<< 16380 1727204192.78042: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/AnsiballZ_setup.py" <<< 16380 1727204192.78045: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmptf9b_wdb" to remote "/root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/AnsiballZ_setup.py" <<< 16380 1727204192.82688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204192.82694: stderr chunk (state=3): >>><<< 16380 1727204192.82697: stdout chunk (state=3): >>><<< 16380 1727204192.82798: done transferring module to remote 16380 1727204192.82806: _low_level_execute_command(): starting 16380 1727204192.82809: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/ /root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/AnsiballZ_setup.py && sleep 0' 16380 1727204192.84032: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204192.84069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204192.84335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204192.84403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204192.84445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204192.86412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204192.86625: stderr chunk (state=3): >>><<< 16380 1727204192.86629: stdout chunk (state=3): >>><<< 16380 1727204192.86632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204192.86639: _low_level_execute_command(): starting 16380 1727204192.86642: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/AnsiballZ_setup.py && sleep 0' 16380 1727204192.87764: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204192.87768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204192.87771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204192.87774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204192.88003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204192.88129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204192.88181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204193.57103: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", 
"ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_local": {}, "ansible_loadavg": {"1m": 0.73876953125, "5m": 0.5849609375, "15m": 0.36669921875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "33", "epoch": "1727204193", "epoch_int": "1727204193", "date": "2024-09-24", "time": "14:56:33", "iso8601_micro": "2024-09-24T18:56:33.200514Z", "iso8601": "2024-09-24T18:56:33Z", "iso8601_basic": "20240924T145633200514", "iso8601_basic_short": "20240924T145633", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline"<<< 16380 1727204193.57134: stdout chunk (state=3): >>>: {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": 
"/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2843, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 874, "free": 2843}, "nocache": {"free": 3472, "used": 245}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 697, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": 
[{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147411456, "block_size": 4096, "block_total": 64479564, "block_available": 61315286, "block_used": 3164278, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204193.59259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204193.59328: stderr chunk (state=3): >>><<< 16380 1727204193.59332: stdout chunk (state=3): >>><<< 16380 1727204193.59499: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_local": {}, "ansible_loadavg": {"1m": 0.73876953125, "5m": 0.5849609375, "15m": 0.36669921875}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "33", "epoch": "1727204193", "epoch_int": "1727204193", "date": "2024-09-24", "time": "14:56:33", "iso8601_micro": "2024-09-24T18:56:33.200514Z", "iso8601": "2024-09-24T18:56:33Z", "iso8601_basic": "20240924T145633200514", "iso8601_basic_short": "20240924T145633", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2843, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 874, "free": 2843}, "nocache": {"free": 3472, "used": 245}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, 
"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 697, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147411456, "block_size": 4096, "block_total": 64479564, "block_available": 61315286, "block_used": 3164278, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204193.59829: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204193.59907: _low_level_execute_command(): starting 16380 1727204193.59925: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204192.7083166-19861-46837191645431/ > /dev/null 2>&1 && sleep 0' 16380 1727204193.61385: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204193.61438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204193.61450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204193.61537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204193.63615: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 16380 1727204193.63804: stderr chunk (state=3): >>><<< 16380 1727204193.63811: stdout chunk (state=3): >>><<< 16380 1727204193.63858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204193.63862: handler run complete 16380 1727204193.64141: variable 'ansible_facts' from source: unknown 16380 1727204193.64262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204193.64543: variable 'ansible_facts' from source: unknown 16380 1727204193.64626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204193.64733: attempt loop complete, returning result 16380 1727204193.64736: _execute() done 16380 1727204193.64739: dumping result to json 16380 1727204193.64762: done dumping result, returning 16380 1727204193.64770: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-0000000004c5] 16380 1727204193.64775: sending task result for task 12b410aa-8751-749c-b6eb-0000000004c5 16380 1727204193.65093: done sending task result for task 12b410aa-8751-749c-b6eb-0000000004c5 16380 1727204193.65096: WORKER PROCESS EXITING ok: [managed-node2] 16380 1727204193.65486: no more pending results, returning what we have 16380 1727204193.65497: results queue empty 16380 1727204193.65501: checking for any_errors_fatal 16380 1727204193.65505: done checking for any_errors_fatal 16380 1727204193.65506: checking for max_fail_percentage 16380 1727204193.65508: done checking for max_fail_percentage 16380 1727204193.65509: checking to see if all hosts have failed and the running result is not ok 16380 1727204193.65510: done checking to see if all hosts have failed 16380 1727204193.65511: getting the remaining hosts for this loop 16380 1727204193.65512: done getting the remaining hosts for this loop 16380 1727204193.65518: getting the next task for host managed-node2 16380 1727204193.65526: done getting next task for host managed-node2 16380 1727204193.65529: ^ task is: TASK: meta (flush_handlers) 16380 1727204193.65531: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 16380 1727204193.65536: getting variables 16380 1727204193.65537: in VariableManager get_vars() 16380 1727204193.65566: Calling all_inventory to load vars for managed-node2 16380 1727204193.65572: Calling groups_inventory to load vars for managed-node2 16380 1727204193.65576: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204193.65588: Calling all_plugins_play to load vars for managed-node2 16380 1727204193.65595: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204193.65598: Calling groups_plugins_play to load vars for managed-node2 16380 1727204193.67761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204193.69691: done with get_vars() 16380 1727204193.69734: done getting variables 16380 1727204193.69823: in VariableManager get_vars() 16380 1727204193.69838: Calling all_inventory to load vars for managed-node2 16380 1727204193.69841: Calling groups_inventory to load vars for managed-node2 16380 1727204193.69845: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204193.69851: Calling all_plugins_play to load vars for managed-node2 16380 1727204193.69854: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204193.69859: Calling groups_plugins_play to load vars for managed-node2 16380 1727204193.71897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204193.74869: done with get_vars() 16380 1727204193.74916: done queuing things up, now waiting for results queue to drain 16380 1727204193.74919: results queue empty 16380 1727204193.74920: checking for any_errors_fatal 16380 1727204193.74926: done checking for any_errors_fatal 16380 1727204193.74927: checking for max_fail_percentage 16380 1727204193.74928: done checking for max_fail_percentage 16380 1727204193.74929: checking to see if all hosts have failed and the running result is not ok 16380 1727204193.74930: done checking to see if all hosts have failed 16380 1727204193.74931: getting the remaining hosts for this loop 16380 1727204193.74937: done getting the remaining hosts for this loop 16380 1727204193.74940: getting the next task for host managed-node2 16380 1727204193.74945: done getting next task for host managed-node2 16380 1727204193.74948: ^ task is: TASK: Include the task '{{ task }}' 16380 1727204193.74950: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
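The repeated 'auto-mux: Trying existing master' / 'mux_client_request_session' lines throughout this log show Ansible reusing a single multiplexed SSH connection (OpenSSH ControlMaster) for every command, which is why each remote round-trip completes quickly. This is the default behaviour of the ssh connection plugin; a hypothetical inventory override that makes the same knobs explicit:

```yaml
# Hypothetical host_vars entry; the values are illustrative, not taken from
# this run. The defaults already produce the master reuse seen above.
ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s"
```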
False 16380 1727204193.74953: getting variables 16380 1727204193.74955: in VariableManager get_vars() 16380 1727204193.74967: Calling all_inventory to load vars for managed-node2 16380 1727204193.74969: Calling groups_inventory to load vars for managed-node2 16380 1727204193.74973: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204193.74979: Calling all_plugins_play to load vars for managed-node2 16380 1727204193.74982: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204193.74986: Calling groups_plugins_play to load vars for managed-node2 16380 1727204193.76650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204193.79262: done with get_vars() 16380 1727204193.79308: done getting variables 16380 1727204193.79531: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:56:33 -0400 (0:00:01.154) 0:00:54.902 ***** 16380 1727204193.79568: entering _queue_task() for managed-node2/include_tasks 16380 1727204193.79976: worker is 1 (out of 1 available) 16380 1727204193.80195: exiting _queue_task() for managed-node2/include_tasks 16380 1727204193.80208: done queuing things up, now waiting for results queue to drain 16380 1727204193.80210: waiting for pending results... 16380 1727204193.80607: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_absent.yml' 16380 1727204193.80615: in run() - task 12b410aa-8751-749c-b6eb-000000000077 16380 1727204193.80622: variable 'ansible_search_path' from source: unknown 16380 1727204193.80626: calling self._execute() 16380 1727204193.80651: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204193.80664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204193.80679: variable 'omit' from source: magic vars 16380 1727204193.81141: variable 'ansible_distribution_major_version' from source: facts 16380 1727204193.81165: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204193.81179: variable 'task' from source: play vars 16380 1727204193.81380: variable 'task' from source: play vars 16380 1727204193.81384: _execute() done 16380 1727204193.81387: dumping result to json 16380 1727204193.81391: done dumping result, returning 16380 1727204193.81394: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_absent.yml' [12b410aa-8751-749c-b6eb-000000000077] 16380 1727204193.81397: sending task result for task 12b410aa-8751-749c-b6eb-000000000077 16380 1727204193.81477: done sending task result for task 12b410aa-8751-749c-b6eb-000000000077 16380 1727204193.81480: WORKER PROCESS EXITING 16380 1727204193.81517: no more pending results, returning what we have 16380 1727204193.81524: in VariableManager get_vars() 16380 1727204193.81563: Calling all_inventory to load vars for managed-node2 16380 1727204193.81567: Calling groups_inventory to load vars for managed-node2 16380 1727204193.81571: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204193.81587: Calling all_plugins_play to load vars for managed-node2 16380 1727204193.81593: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204193.81597: Calling groups_plugins_play to load vars for 
managed-node2 16380 1727204193.84265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204193.87243: done with get_vars() 16380 1727204193.87276: variable 'ansible_search_path' from source: unknown 16380 1727204193.87295: we have included files to process 16380 1727204193.87296: generating all_blocks data 16380 1727204193.87298: done generating all_blocks data 16380 1727204193.87299: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 16380 1727204193.87300: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 16380 1727204193.87303: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 16380 1727204193.87446: in VariableManager get_vars() 16380 1727204193.87468: done with get_vars() 16380 1727204193.87614: done processing included file 16380 1727204193.87616: iterating over new_blocks loaded from include file 16380 1727204193.87620: in VariableManager get_vars() 16380 1727204193.87636: done with get_vars() 16380 1727204193.87637: filtering new block on tags 16380 1727204193.87665: done filtering new block on tags 16380 1727204193.87668: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 16380 1727204193.87675: extending task lists for all hosts with included blocks 16380 1727204193.87722: done extending task lists 16380 1727204193.87723: done processing included files 16380 1727204193.87724: results queue empty 16380 1727204193.87725: checking for any_errors_fatal 16380 1727204193.87727: done checking for any_errors_fatal 16380 1727204193.87728: checking for max_fail_percentage 16380 1727204193.87729: done checking for max_fail_percentage 16380 1727204193.87730: checking to see if all hosts have failed and the running result is not ok 16380 1727204193.87731: done checking to see if all hosts have failed 16380 1727204193.87732: getting the remaining hosts for this loop 16380 1727204193.87733: done getting the remaining hosts for this loop 16380 1727204193.87736: getting the next task for host managed-node2 16380 1727204193.87741: done getting next task for host managed-node2 16380 1727204193.87744: ^ task is: TASK: Include the task 'get_interface_stat.yml' 16380 1727204193.87746: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
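The include just executed corresponds to run_tasks.yml:6, with the file name supplied by the play variable task ('variable task from source: play vars'). A sketch of what that line presumably looks like, reconstructed from the banner and not verified against the collection:

```yaml
# Hypothetical reconstruction of run_tasks.yml:6; here task evaluates to
# 'tasks/assert_device_absent.yml', and the same pattern repeats for the
# nested include of get_interface_stat.yml below.
- name: Include the task '{{ task }}'
  include_tasks: "{{ task }}"
```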
False 16380 1727204193.87749: getting variables 16380 1727204193.87750: in VariableManager get_vars() 16380 1727204193.87761: Calling all_inventory to load vars for managed-node2 16380 1727204193.87764: Calling groups_inventory to load vars for managed-node2 16380 1727204193.87766: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204193.87772: Calling all_plugins_play to load vars for managed-node2 16380 1727204193.87775: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204193.87778: Calling groups_plugins_play to load vars for managed-node2 16380 1727204193.89714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204193.92542: done with get_vars() 16380 1727204193.92580: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.130) 0:00:55.033 ***** 16380 1727204193.92668: entering _queue_task() for managed-node2/include_tasks 16380 1727204193.93042: worker is 1 (out of 1 available) 16380 1727204193.93057: exiting _queue_task() for managed-node2/include_tasks 16380 1727204193.93073: done queuing things up, now waiting for results queue to drain 16380 1727204193.93075: waiting for pending results... 16380 1727204193.93509: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 16380 1727204193.93520: in run() - task 12b410aa-8751-749c-b6eb-0000000004d6 16380 1727204193.93542: variable 'ansible_search_path' from source: unknown 16380 1727204193.93552: variable 'ansible_search_path' from source: unknown 16380 1727204193.93603: calling self._execute() 16380 1727204193.93721: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204193.93737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204193.93756: variable 'omit' from source: magic vars 16380 1727204193.94221: variable 'ansible_distribution_major_version' from source: facts 16380 1727204193.94266: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204193.94270: _execute() done 16380 1727204193.94273: dumping result to json 16380 1727204193.94276: done dumping result, returning 16380 1727204193.94282: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-749c-b6eb-0000000004d6] 16380 1727204193.94498: sending task result for task 12b410aa-8751-749c-b6eb-0000000004d6 16380 1727204193.94568: done sending task result for task 12b410aa-8751-749c-b6eb-0000000004d6 16380 1727204193.94571: WORKER PROCESS EXITING 16380 1727204193.94604: no more pending results, returning what we have 16380 1727204193.94609: in VariableManager get_vars() 16380 1727204193.94647: Calling all_inventory to load vars for managed-node2 16380 1727204193.94650: Calling groups_inventory to load vars for managed-node2 16380 1727204193.94655: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204193.94670: Calling all_plugins_play to load vars for managed-node2 16380 1727204193.94674: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204193.94679: Calling groups_plugins_play to load vars for managed-node2 16380 1727204193.96992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 16380 1727204193.99939: done with get_vars() 16380 1727204193.99975: variable 'ansible_search_path' from source: unknown 16380 1727204193.99976: variable 'ansible_search_path' from source: unknown 16380 1727204193.99987: variable 'task' from source: play vars 16380 1727204194.00112: variable 'task' from source: play vars 16380 1727204194.00155: we have included files to process 16380 1727204194.00157: generating all_blocks data 16380 1727204194.00159: done generating all_blocks data 16380 1727204194.00160: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16380 1727204194.00162: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16380 1727204194.00164: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 16380 1727204194.00397: done processing included file 16380 1727204194.00399: iterating over new_blocks loaded from include file 16380 1727204194.00401: in VariableManager get_vars() 16380 1727204194.00418: done with get_vars() 16380 1727204194.00420: filtering new block on tags 16380 1727204194.00440: done filtering new block on tags 16380 1727204194.00442: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 16380 1727204194.00449: extending task lists for all hosts with included blocks 16380 1727204194.00579: done extending task lists 16380 1727204194.00581: done processing included files 16380 1727204194.00582: results queue empty 16380 1727204194.00583: checking for any_errors_fatal 16380 1727204194.00587: done checking for any_errors_fatal 16380 1727204194.00588: checking for max_fail_percentage 16380 1727204194.00590: done checking for max_fail_percentage 16380 1727204194.00591: checking to see if all hosts have failed and the running result is not ok 16380 1727204194.00592: done checking to see if all hosts have failed 16380 1727204194.00593: getting the remaining hosts for this loop 16380 1727204194.00595: done getting the remaining hosts for this loop 16380 1727204194.00598: getting the next task for host managed-node2 16380 1727204194.00603: done getting next task for host managed-node2 16380 1727204194.00606: ^ task is: TASK: Get stat for interface {{ interface }} 16380 1727204194.00609: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
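The upcoming task name 'Get stat for interface {{ interface }}' is templated from a fact ('variable interface from source: set_fact') that resolves to LSR-TST-br31. A hypothetical equivalent of wherever that fact was defined earlier in the run:

```yaml
# Hypothetical set_fact; the actual task that defined 'interface' ran
# before this excerpt of the log.
- set_fact:
    interface: LSR-TST-br31
```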
False 16380 1727204194.00612: getting variables 16380 1727204194.00613: in VariableManager get_vars() 16380 1727204194.00624: Calling all_inventory to load vars for managed-node2 16380 1727204194.00627: Calling groups_inventory to load vars for managed-node2 16380 1727204194.00630: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204194.00637: Calling all_plugins_play to load vars for managed-node2 16380 1727204194.00640: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204194.00644: Calling groups_plugins_play to load vars for managed-node2 16380 1727204194.02652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204194.05484: done with get_vars() 16380 1727204194.05529: done getting variables 16380 1727204194.05703: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.130) 0:00:55.163 ***** 16380 1727204194.05740: entering _queue_task() for managed-node2/stat 16380 1727204194.06130: worker is 1 (out of 1 available) 16380 1727204194.06143: exiting _queue_task() for managed-node2/stat 16380 1727204194.06156: done queuing things up, now waiting for results queue to drain 16380 1727204194.06159: waiting for pending results... 16380 1727204194.06450: running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 16380 1727204194.06601: in run() - task 12b410aa-8751-749c-b6eb-0000000004e1 16380 1727204194.06628: variable 'ansible_search_path' from source: unknown 16380 1727204194.06724: variable 'ansible_search_path' from source: unknown 16380 1727204194.06728: calling self._execute() 16380 1727204194.06787: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204194.06804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204194.06819: variable 'omit' from source: magic vars 16380 1727204194.07271: variable 'ansible_distribution_major_version' from source: facts 16380 1727204194.07288: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204194.07302: variable 'omit' from source: magic vars 16380 1727204194.07365: variable 'omit' from source: magic vars 16380 1727204194.07498: variable 'interface' from source: set_fact 16380 1727204194.07525: variable 'omit' from source: magic vars 16380 1727204194.07584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204194.07639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204194.07679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204194.07716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204194.07794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204194.07798: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204194.07801: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204194.07803: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 16380 1727204194.07919: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204194.07937: Set connection var ansible_shell_executable to /bin/sh 16380 1727204194.07950: Set connection var ansible_connection to ssh 16380 1727204194.07962: Set connection var ansible_shell_type to sh 16380 1727204194.07973: Set connection var ansible_pipelining to False 16380 1727204194.07988: Set connection var ansible_timeout to 10 16380 1727204194.08018: variable 'ansible_shell_executable' from source: unknown 16380 1727204194.08027: variable 'ansible_connection' from source: unknown 16380 1727204194.08040: variable 'ansible_module_compression' from source: unknown 16380 1727204194.08048: variable 'ansible_shell_type' from source: unknown 16380 1727204194.08146: variable 'ansible_shell_executable' from source: unknown 16380 1727204194.08149: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204194.08152: variable 'ansible_pipelining' from source: unknown 16380 1727204194.08154: variable 'ansible_timeout' from source: unknown 16380 1727204194.08156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204194.08324: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 16380 1727204194.08343: variable 'omit' from source: magic vars 16380 1727204194.08358: starting attempt loop 16380 1727204194.08368: running the handler 16380 1727204194.08392: _low_level_execute_command(): starting 16380 1727204194.08407: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204194.09218: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.09292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204194.09307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204194.09354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.09406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.11216: stdout chunk (state=3): >>>/root <<< 16380 1727204194.11354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204194.11426: stderr chunk (state=3): >>><<< 16380 1727204194.11454: stdout chunk (state=3): >>><<< 16380 1727204194.11476: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204194.11556: _low_level_execute_command(): starting 16380 1727204194.11559: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622 `" && echo ansible-tmp-1727204194.1148348-19901-14321251314622="` echo /root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622 `" ) && sleep 0' 16380 1727204194.12181: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204194.12205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204194.12332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204194.12366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.12447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.14628: stdout chunk (state=3): >>>ansible-tmp-1727204194.1148348-19901-14321251314622=/root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622 <<< 16380 1727204194.14824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204194.14827: stderr chunk (state=3): >>><<< 16380 1727204194.14830: stdout chunk (state=3): >>><<< 16380 1727204194.14995: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204194.1148348-19901-14321251314622=/root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204194.14999: variable 'ansible_module_compression' from source: unknown 16380 1727204194.15002: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 16380 1727204194.15053: variable 'ansible_facts' from source: unknown 16380 1727204194.15153: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/AnsiballZ_stat.py 16380 1727204194.15435: Sending initial data 16380 1727204194.15439: Sent initial data (152 bytes) 16380 1727204194.16004: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204194.16020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.16103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204194.16125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204194.16144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.16222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.17960: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204194.18003: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204194.18045: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp_1m_m3ed /root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/AnsiballZ_stat.py <<< 16380 1727204194.18062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/AnsiballZ_stat.py" <<< 16380 1727204194.18090: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 16380 1727204194.18122: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp_1m_m3ed" to remote "/root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/AnsiballZ_stat.py" <<< 16380 1727204194.19296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204194.19300: stderr chunk (state=3): >>><<< 16380 1727204194.19302: stdout chunk (state=3): >>><<< 16380 1727204194.19304: done transferring module to remote 16380 1727204194.19358: _low_level_execute_command(): starting 16380 1727204194.19362: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/ /root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/AnsiballZ_stat.py && sleep 0' 16380 1727204194.19988: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204194.20008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204194.20110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.20148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204194.20166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204194.20186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.20258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.22286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204194.22302: stdout chunk (state=3): >>><<< 16380 1727204194.22317: stderr chunk (state=3): >>><<< 16380 
1727204194.22346: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204194.22361: _low_level_execute_command(): starting 16380 1727204194.22372: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/AnsiballZ_stat.py && sleep 0' 16380 1727204194.23041: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204194.23060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204194.23076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204194.23097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204194.23115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204194.23164: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.23242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204194.23271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204194.23295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.23387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.41182: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 16380 1727204194.42962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204194.42966: stdout chunk (state=3): >>><<< 16380 1727204194.42968: stderr chunk (state=3): >>><<< 16380 1727204194.42971: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204194.42974: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204194.42977: _low_level_execute_command(): starting 16380 1727204194.42979: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204194.1148348-19901-14321251314622/ > /dev/null 2>&1 && sleep 0' 16380 1727204194.43588: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 16380 1727204194.43610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204194.43639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204194.43660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204194.43677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204194.43688: stderr chunk (state=3): >>>debug2: match not found <<< 16380 1727204194.43705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.43735: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 16380 1727204194.43751: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 16380 1727204194.43855: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204194.43870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204194.43902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.43976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.46076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204194.46087: stdout chunk (state=3): >>><<< 16380 1727204194.46106: stderr chunk (state=3): >>><<< 16380 1727204194.46136: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204194.46157: handler run complete 16380 1727204194.46192: attempt loop complete, returning result 16380 1727204194.46203: _execute() done 16380 1727204194.46297: dumping result to json 16380 1727204194.46301: done dumping result, returning 16380 1727204194.46303: done running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 [12b410aa-8751-749c-b6eb-0000000004e1] 16380 1727204194.46305: sending task result for task 12b410aa-8751-749c-b6eb-0000000004e1 16380 1727204194.46390: done sending task result for task 12b410aa-8751-749c-b6eb-0000000004e1 16380 1727204194.46394: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 16380 1727204194.46477: no more pending results, returning what we have 16380 1727204194.46482: results queue empty 16380 1727204194.46484: checking for any_errors_fatal 16380 1727204194.46486: done checking for any_errors_fatal 16380 1727204194.46487: checking for max_fail_percentage 16380 1727204194.46491: done checking for max_fail_percentage 16380 1727204194.46492: checking to see if all hosts have failed and the running result is not ok 16380 1727204194.46493: done checking 
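The sequence above (mkdir the remote tmpdir, SFTP AnsiballZ_stat.py across, chmod u+x, run it with python3.12, then rm -f -r the tmpdir) is the non-pipelined module execution path; the log's 'Set connection var ansible_pipelining to False' is why every module call pays these extra round-trips. A hypothetical one-line change that would skip the file transfer by feeding the module over the existing SSH channel's stdin:

```yaml
# Hypothetical group_vars/host_vars entry, not set in this run.
ansible_pipelining: true
```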
to see if all hosts have failed 16380 1727204194.46494: getting the remaining hosts for this loop 16380 1727204194.46496: done getting the remaining hosts for this loop 16380 1727204194.46501: getting the next task for host managed-node2 16380 1727204194.46513: done getting next task for host managed-node2 16380 1727204194.46517: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 16380 1727204194.46523: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204194.46530: getting variables 16380 1727204194.46532: in VariableManager get_vars() 16380 1727204194.46567: Calling all_inventory to load vars for managed-node2 16380 1727204194.46571: Calling groups_inventory to load vars for managed-node2 16380 1727204194.46575: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204194.46799: Calling all_plugins_play to load vars for managed-node2 16380 1727204194.46806: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204194.46812: Calling groups_plugins_play to load vars for managed-node2 16380 1727204194.49380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204194.52584: done with get_vars() 16380 1727204194.52641: done getting variables 16380 1727204194.52731: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 16380 1727204194.52896: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.472) 0:00:55.636 ***** 16380 1727204194.52951: entering _queue_task() for managed-node2/assert 16380 1727204194.53614: worker is 1 (out of 1 available) 16380 1727204194.53628: exiting _queue_task() for managed-node2/assert 16380 1727204194.53640: done queuing things up, now waiting for results queue to drain 16380 1727204194.53642: waiting for pending results... 
16380 1727204194.53884: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' 16380 1727204194.53901: in run() - task 12b410aa-8751-749c-b6eb-0000000004d7 16380 1727204194.53940: variable 'ansible_search_path' from source: unknown 16380 1727204194.53950: variable 'ansible_search_path' from source: unknown 16380 1727204194.54012: calling self._execute() 16380 1727204194.54098: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204194.54106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204194.54117: variable 'omit' from source: magic vars 16380 1727204194.54461: variable 'ansible_distribution_major_version' from source: facts 16380 1727204194.54475: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204194.54478: variable 'omit' from source: magic vars 16380 1727204194.54520: variable 'omit' from source: magic vars 16380 1727204194.54604: variable 'interface' from source: set_fact 16380 1727204194.54622: variable 'omit' from source: magic vars 16380 1727204194.54660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204194.54693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204194.54714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204194.54733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204194.54745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204194.54776: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204194.54780: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204194.54783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204194.54873: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204194.54881: Set connection var ansible_shell_executable to /bin/sh 16380 1727204194.54891: Set connection var ansible_connection to ssh 16380 1727204194.54897: Set connection var ansible_shell_type to sh 16380 1727204194.54908: Set connection var ansible_pipelining to False 16380 1727204194.54916: Set connection var ansible_timeout to 10 16380 1727204194.54938: variable 'ansible_shell_executable' from source: unknown 16380 1727204194.54942: variable 'ansible_connection' from source: unknown 16380 1727204194.54945: variable 'ansible_module_compression' from source: unknown 16380 1727204194.54948: variable 'ansible_shell_type' from source: unknown 16380 1727204194.54952: variable 'ansible_shell_executable' from source: unknown 16380 1727204194.54955: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204194.54963: variable 'ansible_pipelining' from source: unknown 16380 1727204194.54966: variable 'ansible_timeout' from source: unknown 16380 1727204194.54970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204194.55100: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 16380 1727204194.55111: variable 'omit' from source: magic vars 16380 1727204194.55119: starting attempt loop 16380 1727204194.55128: running the handler 16380 1727204194.55257: variable 'interface_stat' from source: set_fact 16380 1727204194.55267: Evaluated conditional (not interface_stat.stat.exists): True 16380 1727204194.55274: handler run complete 16380 1727204194.55291: attempt loop complete, returning result 16380 1727204194.55297: _execute() done 16380 1727204194.55302: dumping result to json 16380 1727204194.55305: done dumping result, returning 16380 1727204194.55314: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [12b410aa-8751-749c-b6eb-0000000004d7] 16380 1727204194.55322: sending task result for task 12b410aa-8751-749c-b6eb-0000000004d7 16380 1727204194.55416: done sending task result for task 12b410aa-8751-749c-b6eb-0000000004d7 16380 1727204194.55421: WORKER PROCESS EXITING

ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

16380 1727204194.55505: no more pending results, returning what we have 16380 1727204194.55509: results queue empty 16380 1727204194.55510: checking for any_errors_fatal 16380 1727204194.55521: done checking for any_errors_fatal 16380 1727204194.55522: checking for max_fail_percentage 16380 1727204194.55523: done checking for max_fail_percentage 16380 1727204194.55524: checking to see if all hosts have failed and the running result is not ok 16380 1727204194.55525: done checking to see if all hosts have failed 16380 1727204194.55527: getting the remaining hosts for this loop 16380 1727204194.55528: done getting the remaining hosts for this loop 16380 1727204194.55534: getting the next task for host managed-node2 16380 1727204194.55544: done getting next task for host managed-node2 16380 1727204194.55546: ^ task is: TASK: meta (flush_handlers) 16380 1727204194.55548: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
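The passing assertion above is the second half of a common two-step check: an earlier task stats the interface's device node and exposes the result as interface_stat, and the assert task then evaluates not interface_stat.stat.exists, which the log shows coming back True. A minimal reconstruction of that task pair follows; the /sys/class/net path and the fail_msg are assumptions, since the actual test file is not reproduced in this log:

- name: Get stat of the interface device file          # hypothetical task name
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"             # assumed location of the device node
  register: interface_stat

- name: "Assert that the interface is absent - '{{ interface }}'"
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists                 # the conditional evaluated above
    fail_msg: "{{ interface }} still exists"           # assumed message; not shown in the log

Because assert never changes the host, a passing run reports changed: false together with the default "All assertions passed" message, exactly as in the result above.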
16380 1727204194.55553: getting variables 16380 1727204194.55556: in VariableManager get_vars() 16380 1727204194.55586: Calling all_inventory to load vars for managed-node2 16380 1727204194.55591: Calling groups_inventory to load vars for managed-node2 16380 1727204194.55595: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204194.55606: Calling all_plugins_play to load vars for managed-node2 16380 1727204194.55609: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204194.55612: Calling groups_plugins_play to load vars for managed-node2 16380 1727204194.57853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204194.59838: done with get_vars() 16380 1727204194.59866: done getting variables 16380 1727204194.59931: in VariableManager get_vars() 16380 1727204194.59942: Calling all_inventory to load vars for managed-node2 16380 1727204194.59944: Calling groups_inventory to load vars for managed-node2 16380 1727204194.59946: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204194.59952: Calling all_plugins_play to load vars for managed-node2 16380 1727204194.59953: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204194.59956: Calling groups_plugins_play to load vars for managed-node2 16380 1727204194.61543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204194.63226: done with get_vars() 16380 1727204194.63257: done queuing things up, now waiting for results queue to drain 16380 1727204194.63260: results queue empty 16380 1727204194.63260: checking for any_errors_fatal 16380 1727204194.63263: done checking for any_errors_fatal 16380 1727204194.63264: checking for max_fail_percentage 16380 1727204194.63264: done checking for max_fail_percentage 16380 1727204194.63265: checking to see if all hosts have failed and the running result is not ok 16380 1727204194.63266: done checking to see if all hosts have failed 16380 1727204194.63271: getting the remaining hosts for this loop 16380 1727204194.63272: done getting the remaining hosts for this loop 16380 1727204194.63276: getting the next task for host managed-node2 16380 1727204194.63281: done getting next task for host managed-node2 16380 1727204194.63282: ^ task is: TASK: meta (flush_handlers) 16380 1727204194.63283: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
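The task selected here, meta (flush_handlers), is one Ansible inserts automatically at the section boundaries of every play to run any notified handlers; it executes inside the strategy plugin itself, which is why no connection setup or module transfer appears for it. The same flush can be forced explicitly mid-play, as in this minimal sketch:

- name: Run notified handlers immediately              # hypothetical task name
  ansible.builtin.meta: flush_handlers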
16380 1727204194.63285: getting variables 16380 1727204194.63286: in VariableManager get_vars() 16380 1727204194.63296: Calling all_inventory to load vars for managed-node2 16380 1727204194.63298: Calling groups_inventory to load vars for managed-node2 16380 1727204194.63300: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204194.63306: Calling all_plugins_play to load vars for managed-node2 16380 1727204194.63308: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204194.63310: Calling groups_plugins_play to load vars for managed-node2 16380 1727204194.65182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204194.69053: done with get_vars() 16380 1727204194.69126: done getting variables 16380 1727204194.69232: in VariableManager get_vars() 16380 1727204194.69244: Calling all_inventory to load vars for managed-node2 16380 1727204194.69248: Calling groups_inventory to load vars for managed-node2 16380 1727204194.69251: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204194.69257: Calling all_plugins_play to load vars for managed-node2 16380 1727204194.69260: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204194.69263: Calling groups_plugins_play to load vars for managed-node2 16380 1727204194.71293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204194.73319: done with get_vars() 16380 1727204194.73369: done queuing things up, now waiting for results queue to drain 16380 1727204194.73372: results queue empty 16380 1727204194.73375: checking for any_errors_fatal 16380 1727204194.73377: done checking for any_errors_fatal 16380 1727204194.73378: checking for max_fail_percentage 16380 1727204194.73381: done checking for max_fail_percentage 16380 1727204194.73382: checking to see if all hosts have failed and the running result is not ok 16380 1727204194.73383: done checking to see if all hosts have failed 16380 1727204194.73384: getting the remaining hosts for this loop 16380 1727204194.73385: done getting the remaining hosts for this loop 16380 1727204194.73391: getting the next task for host managed-node2 16380 1727204194.73395: done getting next task for host managed-node2 16380 1727204194.73396: ^ task is: None 16380 1727204194.73400: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
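The recurring "checking for any_errors_fatal" and "checking for max_fail_percentage" entries are the strategy consulting two play-level keywords that decide whether a run should abort early. Neither matters in this run, since nothing fails, but a play that uses them would look roughly like the sketch below; the values are illustrative and not taken from tests_bridge.yml:

- hosts: all
  serial: 10                     # max_fail_percentage is evaluated per serial batch
  max_fail_percentage: 30        # abort once more than 30% of a batch has failed
  any_errors_fatal: false        # true would abort every host on the first error
  tasks:
    - name: Placeholder task
      ansible.builtin.ping: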
16380 1727204194.73402: done queuing things up, now waiting for results queue to drain 16380 1727204194.73403: results queue empty 16380 1727204194.73404: checking for any_errors_fatal 16380 1727204194.73405: done checking for any_errors_fatal 16380 1727204194.73406: checking for max_fail_percentage 16380 1727204194.73407: done checking for max_fail_percentage 16380 1727204194.73408: checking to see if all hosts have failed and the running result is not ok 16380 1727204194.73409: done checking to see if all hosts have failed 16380 1727204194.73411: getting the next task for host managed-node2 16380 1727204194.73414: done getting next task for host managed-node2 16380 1727204194.73416: ^ task is: None 16380 1727204194.73417: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204194.73482: in VariableManager get_vars() 16380 1727204194.73508: done with get_vars() 16380 1727204194.73519: in VariableManager get_vars() 16380 1727204194.73534: done with get_vars() 16380 1727204194.73542: variable 'omit' from source: magic vars 16380 1727204194.73596: in VariableManager get_vars() 16380 1727204194.73609: done with get_vars() 16380 1727204194.73631: variable 'omit' from source: magic vars

PLAY [Verify that cleanup restored state to default] ***************************

16380 1727204194.73863: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 16380 1727204194.73900: getting the remaining hosts for this loop 16380 1727204194.73902: done getting the remaining hosts for this loop 16380 1727204194.73905: getting the next task for host managed-node2 16380 1727204194.73912: done getting next task for host managed-node2 16380 1727204194.73916: ^ task is: TASK: Gathering Facts 16380 1727204194.73918: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
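With the previous play finished (run_state=5), the PLAY banner above starts "Verify that cleanup restored state to default"; the linear strategy is loaded and schedules the implicit Gathering Facts task first. Only the play name is confirmed by the log, so the header below is a sketch with assumed hosts and fact settings:

- name: Verify that cleanup restored state to default
  hosts: all           # assumption; the real host pattern is not visible in this log
  gather_facts: true   # accounts for the TASK [Gathering Facts] scheduled next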
16380 1727204194.73920: getting variables 16380 1727204194.73922: in VariableManager get_vars() 16380 1727204194.73937: Calling all_inventory to load vars for managed-node2 16380 1727204194.73940: Calling groups_inventory to load vars for managed-node2 16380 1727204194.73945: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204194.73953: Calling all_plugins_play to load vars for managed-node2 16380 1727204194.73957: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204194.73961: Calling groups_plugins_play to load vars for managed-node2 16380 1727204194.75594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204194.77232: done with get_vars() 16380 1727204194.77264: done getting variables 16380 1727204194.77328: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.244) 0:00:55.880 *****

16380 1727204194.77351: entering _queue_task() for managed-node2/gather_facts 16380 1727204194.77701: worker is 1 (out of 1 available) 16380 1727204194.77714: exiting _queue_task() for managed-node2/gather_facts 16380 1727204194.77730: done queuing things up, now waiting for results queue to drain 16380 1727204194.77732: waiting for pending results...
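Fact gathering matters here because every task in this run is guarded by the conditional that keeps appearing above as Evaluated conditional (ansible_distribution_major_version != '6'); that variable only exists once facts are in. The guard is presumably a shared when: on the test tasks, along the lines of this sketch (task name and module are placeholders):

- name: Skip this check on EL6                          # hypothetical task name
  ansible.builtin.debug:
    msg: "major version is {{ ansible_distribution_major_version }}"
  when: ansible_distribution_major_version != '6'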
16380 1727204194.77938: running TaskExecutor() for managed-node2/TASK: Gathering Facts 16380 1727204194.78064: in run() - task 12b410aa-8751-749c-b6eb-0000000004fa 16380 1727204194.78078: variable 'ansible_search_path' from source: unknown 16380 1727204194.78121: calling self._execute() 16380 1727204194.78200: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204194.78208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204194.78221: variable 'omit' from source: magic vars 16380 1727204194.78560: variable 'ansible_distribution_major_version' from source: facts 16380 1727204194.78584: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204194.78587: variable 'omit' from source: magic vars 16380 1727204194.78615: variable 'omit' from source: magic vars 16380 1727204194.78649: variable 'omit' from source: magic vars 16380 1727204194.78695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204194.78728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204194.78747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204194.78766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204194.78784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204194.78814: variable 'inventory_hostname' from source: host vars for 'managed-node2' 16380 1727204194.78817: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204194.78824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204194.78915: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204194.78923: Set connection var ansible_shell_executable to /bin/sh 16380 1727204194.78931: Set connection var ansible_connection to ssh 16380 1727204194.78938: Set connection var ansible_shell_type to sh 16380 1727204194.78944: Set connection var ansible_pipelining to False 16380 1727204194.78952: Set connection var ansible_timeout to 10 16380 1727204194.78972: variable 'ansible_shell_executable' from source: unknown 16380 1727204194.78976: variable 'ansible_connection' from source: unknown 16380 1727204194.78981: variable 'ansible_module_compression' from source: unknown 16380 1727204194.78984: variable 'ansible_shell_type' from source: unknown 16380 1727204194.78986: variable 'ansible_shell_executable' from source: unknown 16380 1727204194.79000: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204194.79003: variable 'ansible_pipelining' from source: unknown 16380 1727204194.79005: variable 'ansible_timeout' from source: unknown 16380 1727204194.79007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204194.79191: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204194.79203: variable 'omit' from source: magic vars 16380 1727204194.79217: starting attempt loop 16380 1727204194.79223: running the 
handler 16380 1727204194.79250: variable 'ansible_facts' from source: unknown 16380 1727204194.79264: _low_level_execute_command(): starting 16380 1727204194.79272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204194.79952: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204194.79956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204194.79960: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.80027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204194.80030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204194.80032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.80076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.81886: stdout chunk (state=3): >>>/root <<< 16380 1727204194.82004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204194.82100: stderr chunk (state=3): >>><<< 16380 1727204194.82103: stdout chunk (state=3): >>><<< 16380 1727204194.82130: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204194.82156: _low_level_execute_command(): starting 16380 1727204194.82161: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308 `" && echo ansible-tmp-1727204194.8212812-19926-263434210452308="` echo 
/root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308 `" ) && sleep 0' 16380 1727204194.82829: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 16380 1727204194.82833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204194.82836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204194.82845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204194.82847: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.82898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204194.82909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.82949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.85025: stdout chunk (state=3): >>>ansible-tmp-1727204194.8212812-19926-263434210452308=/root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308 <<< 16380 1727204194.85143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204194.85201: stderr chunk (state=3): >>><<< 16380 1727204194.85205: stdout chunk (state=3): >>><<< 16380 1727204194.85227: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204194.8212812-19926-263434210452308=/root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204194.85257: variable 'ansible_module_compression' from source: unknown 16380 1727204194.85305: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 16380 1727204194.85371: variable 'ansible_facts' from source: unknown 16380 1727204194.85496: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/AnsiballZ_setup.py 16380 1727204194.85631: Sending initial data 16380 1727204194.85635: Sent initial data (154 bytes) 16380 1727204194.86114: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204194.86118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.86120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204194.86123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.86179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204194.86183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.86231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.87919: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204194.87956: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 16380 1727204194.88000: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpxneu_87m /root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/AnsiballZ_setup.py <<< 16380 1727204194.88003: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/AnsiballZ_setup.py" <<< 16380 1727204194.88037: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmpxneu_87m" to remote "/root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/AnsiballZ_setup.py" <<< 16380 1727204194.88044: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/AnsiballZ_setup.py" <<< 16380 1727204194.89688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204194.89766: stderr chunk (state=3): >>><<< 16380 1727204194.89769: stdout chunk (state=3): >>><<< 16380 1727204194.89793: done transferring module to remote 16380 1727204194.89804: _low_level_execute_command(): starting 16380 1727204194.89811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/ /root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/AnsiballZ_setup.py && sleep 0' 16380 1727204194.90305: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204194.90309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.90312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 16380 1727204194.90314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 16380 1727204194.90316: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.90381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204194.90416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204194.90419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.90450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204194.92427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204194.92481: stderr chunk (state=3): >>><<< 16380 1727204194.92484: stdout chunk (state=3): >>><<< 16380 1727204194.92502: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204194.92505: _low_level_execute_command(): starting 16380 1727204194.92511: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/AnsiballZ_setup.py && sleep 0' 16380 1727204194.93081: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204194.93085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204194.93087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.93090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204194.93098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204194.93181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204194.93204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204194.93262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204195.63476: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", 
"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2838, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 879, "free": 2838}, "nocache": {"free": 3467, "used": 250}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible<<< 16380 1727204195.63491: stdout chunk (state=3): >>>_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 699, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147390976, "block_size": 4096, "block_total": 64479564, "block_available": 61315281, "block_used": 3164283, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "35", "epoch": "1727204195", "epoch_int": "1727204195", "date": "2024-09-24", "time": "14:56:35", "iso8601_micro": "2024-09-24T18:56:35.596200Z", "iso8601": "2024-09-24T18:56:35Z", "iso8601_basic": "20240924T145635596200", "iso8601_basic_short": "20240924T145635", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.73876953125, "5m": 0.5849609375, "15m": 0.36669921875}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 16380 1727204195.65865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204195.65999: stderr chunk (state=3): >>><<< 16380 1727204195.66007: stdout chunk (state=3): >>><<< 16380 1727204195.66012: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2838, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 879, "free": 2838}, "nocache": {"free": 3467, "used": 250}, "swap": 
{"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 699, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147390976, "block_size": 4096, "block_total": 64479564, "block_available": 61315281, "block_used": 3164283, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "35", "epoch": "1727204195", "epoch_int": "1727204195", "date": "2024-09-24", "time": "14:56:35", "iso8601_micro": "2024-09-24T18:56:35.596200Z", "iso8601": "2024-09-24T18:56:35Z", "iso8601_basic": "20240924T145635596200", "iso8601_basic_short": "20240924T145635", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": 
[3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.73876953125, "5m": 0.5849609375, "15m": 0.36669921875}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204195.66459: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204195.66506: _low_level_execute_command(): starting 16380 1727204195.66520: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204194.8212812-19926-263434210452308/ > /dev/null 2>&1 && sleep 0' 16380 1727204195.67302: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 16380 1727204195.67332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204195.67362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204195.67543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204195.69609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204195.69624: stdout chunk (state=3): >>><<< 16380 1727204195.69638: stderr chunk 
(state=3): >>><<< 16380 1727204195.69661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204195.69679: handler run complete 16380 1727204195.69986: variable 'ansible_facts' from source: unknown 16380 1727204195.70237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204195.70729: variable 'ansible_facts' from source: unknown 16380 1727204195.70799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204195.70905: attempt loop complete, returning result 16380 1727204195.70911: _execute() done 16380 1727204195.70914: dumping result to json 16380 1727204195.70937: done dumping result, returning 16380 1727204195.70947: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-749c-b6eb-0000000004fa] 16380 1727204195.71130: sending task result for task 12b410aa-8751-749c-b6eb-0000000004fa

ok: [managed-node2]

16380 1727204195.72151: no more pending results, returning what we have 16380 1727204195.72155: results queue empty 16380 1727204195.72157: checking for any_errors_fatal 16380 1727204195.72158: done checking for any_errors_fatal 16380 1727204195.72159: checking for max_fail_percentage 16380 1727204195.72161: done checking for max_fail_percentage 16380 1727204195.72162: checking to see if all hosts have failed and the running result is not ok 16380 1727204195.72163: done checking to see if all hosts have failed 16380 1727204195.72164: getting the remaining hosts for this loop 16380 1727204195.72166: done getting the remaining hosts for this loop 16380 1727204195.72170: getting the next task for host managed-node2 16380 1727204195.72176: done getting next task for host managed-node2 16380 1727204195.72178: ^ task is: TASK: meta (flush_handlers) 16380 1727204195.72180: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
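The invocation echoed with the result shows exactly how the facts were collected: ansible.legacy.setup ran with gather_subset of ["all"], a gather_timeout of 10 seconds, an empty filter, and fact_path /etc/ansible/facts.d. The same collection can be requested explicitly from a playbook, mirroring those arguments:

- name: Gather facts explicitly                         # hypothetical task name
  ansible.builtin.setup:
    gather_subset:
      - all
    gather_timeout: 10
    fact_path: /etc/ansible/facts.d

Narrowing gather_subset (for example to network facts only) is the usual way to trim the roughly one second this step costs per host.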
False 16380 1727204195.72185: getting variables 16380 1727204195.72187: in VariableManager get_vars() 16380 1727204195.72497: done sending task result for task 12b410aa-8751-749c-b6eb-0000000004fa 16380 1727204195.72501: WORKER PROCESS EXITING 16380 1727204195.72531: Calling all_inventory to load vars for managed-node2 16380 1727204195.72535: Calling groups_inventory to load vars for managed-node2 16380 1727204195.72539: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204195.72551: Calling all_plugins_play to load vars for managed-node2 16380 1727204195.72554: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204195.72557: Calling groups_plugins_play to load vars for managed-node2 16380 1727204195.78247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204195.82424: done with get_vars() 16380 1727204195.82470: done getting variables 16380 1727204195.82562: in VariableManager get_vars() 16380 1727204195.82576: Calling all_inventory to load vars for managed-node2 16380 1727204195.82579: Calling groups_inventory to load vars for managed-node2 16380 1727204195.82582: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204195.82588: Calling all_plugins_play to load vars for managed-node2 16380 1727204195.82594: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204195.82598: Calling groups_plugins_play to load vars for managed-node2 16380 1727204195.84732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204195.87732: done with get_vars() 16380 1727204195.87783: done queuing things up, now waiting for results queue to drain 16380 1727204195.87785: results queue empty 16380 1727204195.87786: checking for any_errors_fatal 16380 1727204195.87795: done checking for any_errors_fatal 16380 1727204195.87796: checking for max_fail_percentage 16380 1727204195.87797: done checking for max_fail_percentage 16380 1727204195.87799: checking to see if all hosts have failed and the running result is not ok 16380 1727204195.87800: done checking to see if all hosts have failed 16380 1727204195.87801: getting the remaining hosts for this loop 16380 1727204195.87806: done getting the remaining hosts for this loop 16380 1727204195.87810: getting the next task for host managed-node2 16380 1727204195.87816: done getting next task for host managed-node2 16380 1727204195.87821: ^ task is: TASK: Verify network state restored to default 16380 1727204195.87823: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204195.87826: getting variables 16380 1727204195.87828: in VariableManager get_vars() 16380 1727204195.87840: Calling all_inventory to load vars for managed-node2 16380 1727204195.87843: Calling groups_inventory to load vars for managed-node2 16380 1727204195.87846: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204195.87853: Calling all_plugins_play to load vars for managed-node2 16380 1727204195.87856: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204195.87860: Calling groups_plugins_play to load vars for managed-node2 16380 1727204195.97151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204196.00778: done with get_vars() 16380 1727204196.00824: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Tuesday 24 September 2024 14:56:36 -0400 (0:00:01.235) 0:00:57.115 ***** 16380 1727204196.00916: entering _queue_task() for managed-node2/include_tasks 16380 1727204196.01382: worker is 1 (out of 1 available) 16380 1727204196.01399: exiting _queue_task() for managed-node2/include_tasks 16380 1727204196.01414: done queuing things up, now waiting for results queue to drain 16380 1727204196.01416: waiting for pending results... 16380 1727204196.01733: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 16380 1727204196.01866: in run() - task 12b410aa-8751-749c-b6eb-00000000007a 16380 1727204196.01896: variable 'ansible_search_path' from source: unknown 16380 1727204196.02005: calling self._execute() 16380 1727204196.02200: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204196.02217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204196.02235: variable 'omit' from source: magic vars 16380 1727204196.02723: variable 'ansible_distribution_major_version' from source: facts 16380 1727204196.02745: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204196.02758: _execute() done 16380 1727204196.02767: dumping result to json 16380 1727204196.02776: done dumping result, returning 16380 1727204196.02996: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [12b410aa-8751-749c-b6eb-00000000007a] 16380 1727204196.02999: sending task result for task 12b410aa-8751-749c-b6eb-00000000007a 16380 1727204196.03077: done sending task result for task 12b410aa-8751-749c-b6eb-00000000007a 16380 1727204196.03081: WORKER PROCESS EXITING 16380 1727204196.03114: no more pending results, returning what we have 16380 1727204196.03119: in VariableManager get_vars() 16380 1727204196.03160: Calling all_inventory to load vars for managed-node2 16380 1727204196.03164: Calling groups_inventory to load vars for managed-node2 16380 1727204196.03168: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204196.03184: Calling all_plugins_play to load vars for managed-node2 16380 1727204196.03188: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204196.03195: Calling groups_plugins_play to load vars for managed-node2 16380 1727204196.05777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204196.11031: done with get_vars() 16380 1727204196.11071: 
variable 'ansible_search_path' from source: unknown 16380 1727204196.11191: we have included files to process 16380 1727204196.11194: generating all_blocks data 16380 1727204196.11196: done generating all_blocks data 16380 1727204196.11198: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 16380 1727204196.11199: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 16380 1727204196.11203: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 16380 1727204196.11996: done processing included file 16380 1727204196.11999: iterating over new_blocks loaded from include file 16380 1727204196.12001: in VariableManager get_vars() 16380 1727204196.12016: done with get_vars() 16380 1727204196.12018: filtering new block on tags 16380 1727204196.12040: done filtering new block on tags 16380 1727204196.12044: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 16380 1727204196.12050: extending task lists for all hosts with included blocks 16380 1727204196.12090: done extending task lists 16380 1727204196.12092: done processing included files 16380 1727204196.12093: results queue empty 16380 1727204196.12094: checking for any_errors_fatal 16380 1727204196.12096: done checking for any_errors_fatal 16380 1727204196.12097: checking for max_fail_percentage 16380 1727204196.12099: done checking for max_fail_percentage 16380 1727204196.12100: checking to see if all hosts have failed and the running result is not ok 16380 1727204196.12100: done checking to see if all hosts have failed 16380 1727204196.12101: getting the remaining hosts for this loop 16380 1727204196.12103: done getting the remaining hosts for this loop 16380 1727204196.12106: getting the next task for host managed-node2 16380 1727204196.12112: done getting next task for host managed-node2 16380 1727204196.12114: ^ task is: TASK: Check routes and DNS 16380 1727204196.12117: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204196.12120: getting variables 16380 1727204196.12121: in VariableManager get_vars() 16380 1727204196.12132: Calling all_inventory to load vars for managed-node2 16380 1727204196.12135: Calling groups_inventory to load vars for managed-node2 16380 1727204196.12138: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204196.12144: Calling all_plugins_play to load vars for managed-node2 16380 1727204196.12147: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204196.12151: Calling groups_plugins_play to load vars for managed-node2 16380 1727204196.14887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204196.18471: done with get_vars() 16380 1727204196.18571: done getting variables 16380 1727204196.18672: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.178) 0:00:57.294 ***** 16380 1727204196.18763: entering _queue_task() for managed-node2/shell 16380 1727204196.19209: worker is 1 (out of 1 available) 16380 1727204196.19225: exiting _queue_task() for managed-node2/shell 16380 1727204196.19240: done queuing things up, now waiting for results queue to drain 16380 1727204196.19243: waiting for pending results... 16380 1727204196.20248: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 16380 1727204196.20954: in run() - task 12b410aa-8751-749c-b6eb-00000000050b 16380 1727204196.21074: variable 'ansible_search_path' from source: unknown 16380 1727204196.21138: variable 'ansible_search_path' from source: unknown 16380 1727204196.21182: calling self._execute() 16380 1727204196.21496: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204196.21520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204196.21609: variable 'omit' from source: magic vars 16380 1727204196.22017: variable 'ansible_distribution_major_version' from source: facts 16380 1727204196.22024: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204196.22030: variable 'omit' from source: magic vars 16380 1727204196.22091: variable 'omit' from source: magic vars 16380 1727204196.22206: variable 'omit' from source: magic vars 16380 1727204196.22210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 16380 1727204196.22228: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 16380 1727204196.22247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 16380 1727204196.22265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204196.22276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 16380 1727204196.22308: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
16380 1727204196.22312: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204196.22315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204196.22426: Set connection var ansible_module_compression to ZIP_DEFLATED 16380 1727204196.22433: Set connection var ansible_shell_executable to /bin/sh 16380 1727204196.22442: Set connection var ansible_connection to ssh 16380 1727204196.22449: Set connection var ansible_shell_type to sh 16380 1727204196.22455: Set connection var ansible_pipelining to False 16380 1727204196.22464: Set connection var ansible_timeout to 10 16380 1727204196.22487: variable 'ansible_shell_executable' from source: unknown 16380 1727204196.22490: variable 'ansible_connection' from source: unknown 16380 1727204196.22495: variable 'ansible_module_compression' from source: unknown 16380 1727204196.22497: variable 'ansible_shell_type' from source: unknown 16380 1727204196.22501: variable 'ansible_shell_executable' from source: unknown 16380 1727204196.22503: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204196.22507: variable 'ansible_pipelining' from source: unknown 16380 1727204196.22511: variable 'ansible_timeout' from source: unknown 16380 1727204196.22517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204196.22656: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204196.22666: variable 'omit' from source: magic vars 16380 1727204196.22675: starting attempt loop 16380 1727204196.22679: running the handler 16380 1727204196.22695: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 16380 1727204196.22795: _low_level_execute_command(): starting 16380 1727204196.22798: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 16380 1727204196.23363: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 16380 1727204196.23381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204196.23452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204196.23514: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 16380 1727204196.23564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204196.25359: stdout chunk (state=3): >>>/root <<< 16380 1727204196.25468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204196.25526: stderr chunk (state=3): >>><<< 16380 1727204196.25530: stdout chunk (state=3): >>><<< 16380 1727204196.25553: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204196.25564: _low_level_execute_command(): starting 16380 1727204196.25571: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416 `" && echo ansible-tmp-1727204196.2555196-19965-90299282181416="` echo /root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416 `" ) && sleep 0' 16380 1727204196.26022: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204196.26025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204196.26049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204196.26052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204196.26119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204196.26122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204196.26209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 16380 1727204196.28258: stdout chunk (state=3): >>>ansible-tmp-1727204196.2555196-19965-90299282181416=/root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416 <<< 16380 1727204196.28440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204196.28444: stdout chunk (state=3): >>><<< 16380 1727204196.28453: stderr chunk (state=3): >>><<< 16380 1727204196.28469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204196.2555196-19965-90299282181416=/root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204196.28499: variable 'ansible_module_compression' from source: unknown 16380 1727204196.28544: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-16380dj4dhlpm/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 16380 1727204196.28634: variable 'ansible_facts' from source: unknown 16380 1727204196.28709: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/AnsiballZ_command.py 16380 1727204196.28824: Sending initial data 16380 1727204196.28827: Sent initial data (155 bytes) 16380 1727204196.29384: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204196.29387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204196.29422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 
1727204196.31174: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 16380 1727204196.31212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 16380 1727204196.31251: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp9nl344yx /root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/AnsiballZ_command.py <<< 16380 1727204196.31275: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/AnsiballZ_command.py" <<< 16380 1727204196.31312: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-16380dj4dhlpm/tmp9nl344yx" to remote "/root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/AnsiballZ_command.py" <<< 16380 1727204196.32466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204196.32607: stderr chunk (state=3): >>><<< 16380 1727204196.32613: stdout chunk (state=3): >>><<< 16380 1727204196.32617: done transferring module to remote 16380 1727204196.32620: _low_level_execute_command(): starting 16380 1727204196.32623: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/ /root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/AnsiballZ_command.py && sleep 0' 16380 1727204196.33268: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204196.33272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204196.33275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 16380 1727204196.33279: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204196.33282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204196.33348: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204196.33393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204196.35413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204196.35440: stderr chunk (state=3): >>><<< 16380 1727204196.35454: stdout chunk (state=3): >>><<< 16380 1727204196.35466: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204196.35470: _low_level_execute_command(): starting 16380 1727204196.35476: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/AnsiballZ_command.py && sleep 0' 16380 1727204196.35996: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204196.36020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204196.36024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204196.36085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 16380 1727204196.36094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204196.36134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204196.54861: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 
00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3307sec preferred_lft 3307sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:56:36.538415", "end": "2024-09-24 14:56:36.547554", "delta": "0:00:00.009139", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 16380 1727204196.56634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 16380 1727204196.56742: stderr chunk (state=3): >>><<< 16380 1727204196.56756: stdout chunk (state=3): >>><<< 16380 1727204196.56776: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3307sec preferred_lft 3307sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:56:36.538415", "end": "2024-09-24 14:56:36.547554", "delta": "0:00:00.009139", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 16380 1727204196.56845: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 16380 1727204196.56854: _low_level_execute_command(): starting 16380 1727204196.56863: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204196.2555196-19965-90299282181416/ > /dev/null 2>&1 && sleep 0' 16380 1727204196.57483: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 16380 1727204196.57487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 16380 1727204196.57524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 16380 1727204196.57528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 16380 1727204196.57530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 16380 1727204196.57597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 16380 1727204196.57605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 16380 1727204196.57647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 16380 1727204196.59627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 16380 1727204196.59697: stderr chunk (state=3): >>><<< 16380 1727204196.59701: stdout 
chunk (state=3): >>><<< 16380 1727204196.59736: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 16380 1727204196.59739: handler run complete 16380 1727204196.59758: Evaluated conditional (False): False 16380 1727204196.59831: attempt loop complete, returning result 16380 1727204196.59834: _execute() done 16380 1727204196.59837: dumping result to json 16380 1727204196.59839: done dumping result, returning 16380 1727204196.59841: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [12b410aa-8751-749c-b6eb-00000000050b] 16380 1727204196.59843: sending task result for task 12b410aa-8751-749c-b6eb-00000000050b 16380 1727204196.59947: done sending task result for task 12b410aa-8751-749c-b6eb-00000000050b 16380 1727204196.59951: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009139", "end": "2024-09-24 14:56:36.547554", "rc": 0, "start": "2024-09-24 14:56:36.538415" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3307sec preferred_lft 3307sec inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. 
This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 16380 1727204196.60045: no more pending results, returning what we have 16380 1727204196.60049: results queue empty 16380 1727204196.60050: checking for any_errors_fatal 16380 1727204196.60052: done checking for any_errors_fatal 16380 1727204196.60053: checking for max_fail_percentage 16380 1727204196.60055: done checking for max_fail_percentage 16380 1727204196.60055: checking to see if all hosts have failed and the running result is not ok 16380 1727204196.60056: done checking to see if all hosts have failed 16380 1727204196.60057: getting the remaining hosts for this loop 16380 1727204196.60059: done getting the remaining hosts for this loop 16380 1727204196.60071: getting the next task for host managed-node2 16380 1727204196.60078: done getting next task for host managed-node2 16380 1727204196.60080: ^ task is: TASK: Verify DNS and network connectivity 16380 1727204196.60088: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 16380 1727204196.60094: getting variables 16380 1727204196.60096: in VariableManager get_vars() 16380 1727204196.60127: Calling all_inventory to load vars for managed-node2 16380 1727204196.60130: Calling groups_inventory to load vars for managed-node2 16380 1727204196.60134: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204196.60146: Calling all_plugins_play to load vars for managed-node2 16380 1727204196.60149: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204196.60152: Calling groups_plugins_play to load vars for managed-node2 16380 1727204196.61881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204196.63976: done with get_vars() 16380 1727204196.64013: done getting variables 16380 1727204196.64069: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.453) 0:00:57.747 ***** 16380 1727204196.64097: entering _queue_task() for managed-node2/shell 16380 1727204196.64409: worker is 1 (out of 1 available) 16380 1727204196.64427: exiting _queue_task() for managed-node2/shell 16380 1727204196.64445: done queuing things up, now waiting for results queue to drain 16380 1727204196.64448: waiting for pending results... 16380 1727204196.64760: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 16380 1727204196.64799: in run() - task 12b410aa-8751-749c-b6eb-00000000050c 16380 1727204196.64813: variable 'ansible_search_path' from source: unknown 16380 1727204196.64816: variable 'ansible_search_path' from source: unknown 16380 1727204196.64854: calling self._execute() 16380 1727204196.64947: variable 'ansible_host' from source: host vars for 'managed-node2' 16380 1727204196.64953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 16380 1727204196.64965: variable 'omit' from source: magic vars 16380 1727204196.65355: variable 'ansible_distribution_major_version' from source: facts 16380 1727204196.65399: Evaluated conditional (ansible_distribution_major_version != '6'): True 16380 1727204196.65529: variable 'ansible_facts' from source: unknown 16380 1727204196.66236: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 16380 1727204196.66239: when evaluation is False, skipping this task 16380 1727204196.66242: _execute() done 16380 1727204196.66246: dumping result to json 16380 1727204196.66251: done dumping result, returning 16380 1727204196.66260: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [12b410aa-8751-749c-b6eb-00000000050c] 16380 1727204196.66270: sending task result for task 12b410aa-8751-749c-b6eb-00000000050c 16380 1727204196.66369: done sending task result for task 12b410aa-8751-749c-b6eb-00000000050c 16380 1727204196.66372: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 16380 1727204196.66429: no more 
pending results, returning what we have 16380 1727204196.66432: results queue empty 16380 1727204196.66433: checking for any_errors_fatal 16380 1727204196.66446: done checking for any_errors_fatal 16380 1727204196.66447: checking for max_fail_percentage 16380 1727204196.66449: done checking for max_fail_percentage 16380 1727204196.66450: checking to see if all hosts have failed and the running result is not ok 16380 1727204196.66451: done checking to see if all hosts have failed 16380 1727204196.66452: getting the remaining hosts for this loop 16380 1727204196.66454: done getting the remaining hosts for this loop 16380 1727204196.66458: getting the next task for host managed-node2 16380 1727204196.66467: done getting next task for host managed-node2 16380 1727204196.66470: ^ task is: TASK: meta (flush_handlers) 16380 1727204196.66472: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204196.66476: getting variables 16380 1727204196.66478: in VariableManager get_vars() 16380 1727204196.66511: Calling all_inventory to load vars for managed-node2 16380 1727204196.66515: Calling groups_inventory to load vars for managed-node2 16380 1727204196.66521: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204196.66533: Calling all_plugins_play to load vars for managed-node2 16380 1727204196.66536: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204196.66539: Calling groups_plugins_play to load vars for managed-node2 16380 1727204196.68027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204196.70017: done with get_vars() 16380 1727204196.70053: done getting variables 16380 1727204196.70121: in VariableManager get_vars() 16380 1727204196.70133: Calling all_inventory to load vars for managed-node2 16380 1727204196.70136: Calling groups_inventory to load vars for managed-node2 16380 1727204196.70141: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204196.70148: Calling all_plugins_play to load vars for managed-node2 16380 1727204196.70151: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204196.70155: Calling groups_plugins_play to load vars for managed-node2 16380 1727204196.71677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204196.73410: done with get_vars() 16380 1727204196.73436: done queuing things up, now waiting for results queue to drain 16380 1727204196.73437: results queue empty 16380 1727204196.73438: checking for any_errors_fatal 16380 1727204196.73440: done checking for any_errors_fatal 16380 1727204196.73441: checking for max_fail_percentage 16380 1727204196.73442: done checking for max_fail_percentage 16380 1727204196.73442: checking to see if all hosts have failed and the running result is not ok 16380 1727204196.73443: done checking to see if all hosts have failed 16380 1727204196.73444: getting the remaining hosts for this loop 16380 1727204196.73444: done getting the remaining hosts for this loop 16380 1727204196.73446: getting the next task for host managed-node2 16380 1727204196.73450: done getting next task for host managed-node2 16380 1727204196.73451: ^ task is: TASK: meta (flush_handlers) 16380 
1727204196.73452: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 16380 1727204196.73455: getting variables 16380 1727204196.73455: in VariableManager get_vars() 16380 1727204196.73461: Calling all_inventory to load vars for managed-node2 16380 1727204196.73463: Calling groups_inventory to load vars for managed-node2 16380 1727204196.73465: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204196.73469: Calling all_plugins_play to load vars for managed-node2 16380 1727204196.73471: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204196.73473: Calling groups_plugins_play to load vars for managed-node2 16380 1727204196.74850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204196.77503: done with get_vars() 16380 1727204196.77540: done getting variables 16380 1727204196.77604: in VariableManager get_vars() 16380 1727204196.77616: Calling all_inventory to load vars for managed-node2 16380 1727204196.77621: Calling groups_inventory to load vars for managed-node2 16380 1727204196.77624: Calling all_plugins_inventory to load vars for managed-node2 16380 1727204196.77630: Calling all_plugins_play to load vars for managed-node2 16380 1727204196.77633: Calling groups_plugins_inventory to load vars for managed-node2 16380 1727204196.77637: Calling groups_plugins_play to load vars for managed-node2 16380 1727204196.79576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 16380 1727204196.81960: done with get_vars() 16380 1727204196.82004: done queuing things up, now waiting for results queue to drain 16380 1727204196.82006: results queue empty 16380 1727204196.82008: checking for any_errors_fatal 16380 1727204196.82009: done checking for any_errors_fatal 16380 1727204196.82010: checking for max_fail_percentage 16380 1727204196.82012: done checking for max_fail_percentage 16380 1727204196.82013: checking to see if all hosts have failed and the running result is not ok 16380 1727204196.82014: done checking to see if all hosts have failed 16380 1727204196.82014: getting the remaining hosts for this loop 16380 1727204196.82016: done getting the remaining hosts for this loop 16380 1727204196.82028: getting the next task for host managed-node2 16380 1727204196.82032: done getting next task for host managed-node2 16380 1727204196.82033: ^ task is: None 16380 1727204196.82035: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
16380 1727204196.82036: done queuing things up, now waiting for results queue to drain
16380 1727204196.82038: results queue empty
16380 1727204196.82039: checking for any_errors_fatal
16380 1727204196.82040: done checking for any_errors_fatal
16380 1727204196.82041: checking for max_fail_percentage
16380 1727204196.82042: done checking for max_fail_percentage
16380 1727204196.82043: checking to see if all hosts have failed and the running result is not ok
16380 1727204196.82044: done checking to see if all hosts have failed
16380 1727204196.82045: getting the next task for host managed-node2
16380 1727204196.82048: done getting next task for host managed-node2
16380 1727204196.82049: ^ task is: None
16380 1727204196.82050: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2              : ok=81   changed=3    unreachable=0    failed=0    skipped=72   rescued=0    ignored=2

Tuesday 24 September 2024  14:56:36 -0400 (0:00:00.180)       0:00:57.927 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 3.38s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.52s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.51s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which packages are installed --- 2.29s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 2.11s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.59s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.39s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
Gathering Facts --------------------------------------------------------- 1.32s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.31s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.27s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Re-test connectivity ---------------- 1.26s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gathering Facts --------------------------------------------------------- 1.26s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.24s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
fedora.linux_system_roles.network : Check which packages are installed --- 1.22s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.21s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.20s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.15s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.13s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gather current interface info ------------------------------------------- 0.99s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
16380 1727204196.82175: RUNNING CLEANUP
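
For readers reconstructing the test from this log: the diagnostic script run by the "Check routes and DNS" task is preserved verbatim in the recorded module args (_raw_params) above. A minimal sketch of an equivalent task follows, assuming ansible.builtin.shell and a changed_when guard inferred from the displayed result reporting "changed": false (the shell module defaults to reporting a change); the actual source at check_network_dns.yml:6 may differ.

    # Sketch reconstructed from the logged module args; not the verbatim
    # playbook source. changed_when is an assumption inferred from the
    # displayed result ("changed": false), not shown in the log itself.
    - name: Check routes and DNS
      ansible.builtin.shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
          cat /etc/resolv.conf
        else
          echo NO /etc/resolv.conf
          ls -alrtF /etc/resolv.* || :
        fi
      changed_when: false

The follow-on "Verify DNS and network connectivity" task was skipped because its recorded false_condition, ansible_facts["distribution"] == "CentOS", did not hold on the managed node.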